Apr 22 17:50:59.094122 ip-10-0-131-69 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 17:50:59.094136 ip-10-0-131-69 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 17:50:59.094172 ip-10-0-131-69 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 17:50:59.094473 ip-10-0-131-69 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 17:51:09.179420 ip-10-0-131-69 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 17:51:09.179442 ip-10-0-131-69 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5bdc2cb8e7874aaa838926d92e5a549f --
Apr 22 17:53:37.999964 ip-10-0-131-69 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:38.429165 ip-10-0-131-69 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:38.429165 ip-10-0-131-69 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:38.429165 ip-10-0-131-69 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:38.429165 ip-10-0-131-69 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:38.429165 ip-10-0-131-69 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:38.430735 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.430647 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:38.433665 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433651 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.433665 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433664 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433668 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433671 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433674 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433678 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433681 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433684 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433695 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433698 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433701 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433705 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433708 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433711 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433714 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433716 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433719 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433722 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433725 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433727 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433731 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.433725 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433733 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433737 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433739 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433742 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433745 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433747 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433750 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433754 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433758 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433761 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433769 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433773 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433776 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433780 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433790 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433793 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433796 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433800 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433802 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.434199 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433805 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433807 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433810 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433812 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433815 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433818 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433820 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433823 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433825 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433828 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433830 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433833 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433835 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433838 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433840 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433843 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433846 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433849 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433852 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433854 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.434701 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433857 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433859 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433862 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433864 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433873 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433875 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433878 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433881 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433883 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433886 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433889 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433891 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433893 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433897 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433899 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433902 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433904 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433907 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433910 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433912 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.435200 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433915 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433917 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433920 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433924 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433927 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.433930 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434308 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434313 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434316 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434319 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434322 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434324 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434327 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434330 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434333 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434335 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434338 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434341 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434344 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434347 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.435694 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434349 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434352 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434354 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434357 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434359 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434361 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434364 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434367 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434369 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434372 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434374 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434377 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434380 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434382 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434385 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434388 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434391 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434393 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434396 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434400 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.436175 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434402 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434405 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434407 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434410 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434413 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434415 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434418 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434420 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434423 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434429 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434431 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434434 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434437 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434439 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434442 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434444 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434447 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434450 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434452 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434455 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.436707 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434458 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434460 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434463 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434466 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434469 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434474 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434476 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434479 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434482 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434485 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434488 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434491 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434493 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434496 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434498 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434501 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434503 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434508 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.437225 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434512 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434514 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434517 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434519 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434522 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434526 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434529 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434531 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434534 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434536 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434539 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434541 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434544 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.434546 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435316 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435327 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435334 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435339 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435343 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435347 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435352 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:38.437763 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435357 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435361 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435364 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435367 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435371 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435374 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435377 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435380 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435383 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435386 2583 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435389 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435392 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435397 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435400 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435403 2583 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435406 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435410 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435414 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435417 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435420 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435424 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435427 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435430 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435433 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435436 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:38.438275 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435439 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435444 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435447 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435450 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435453 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435456 2583 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435459 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435465 2583 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435469 2583 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435472 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435475 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435478 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435482 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435485 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435488 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435491 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435494 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435497 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435500 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435503 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435506 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435509 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435512 2583 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422
17:53:38.435516 2583 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435519 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:53:38.438901 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435523 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435528 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435531 2583 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435535 2583 flags.go:64] FLAG: --help="false" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435538 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435541 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435544 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435547 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435550 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435554 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435557 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435559 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:38.439495 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435562 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435565 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435568 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435572 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435575 2583 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435578 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435581 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435584 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435587 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435590 2583 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435592 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435595 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:38.439495 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435598 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435604 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435607 2583 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435609 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435612 2583 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435615 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435618 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435633 2583 flags.go:64] FLAG: --manifest-url="" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435636 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435642 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435646 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435650 2583 flags.go:64] FLAG: --max-pods="110" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435653 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435656 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435659 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435663 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435666 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:38.440085 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:53:38.435669 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435672 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435679 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435682 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435685 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435689 2583 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:38.440085 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435691 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435697 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435700 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435704 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435707 2583 flags.go:64] FLAG: --port="10250" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435710 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435713 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-075d3a521ca2f7503" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435717 2583 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:38.440660 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435719 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435722 2583 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435725 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435728 2583 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435732 2583 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435735 2583 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435738 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435740 2583 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435744 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435747 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435751 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435754 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435757 2583 flags.go:64] FLAG: --runonce="false" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435760 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435763 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:38.440660 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435766 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435769 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435772 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:38.440660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435775 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435778 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435781 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435784 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435788 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435791 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435793 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435798 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435801 2583 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435803 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435809 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435812 
2583 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435815 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435819 2583 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435822 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435825 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435828 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435831 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435834 2583 flags.go:64] FLAG: --v="2" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435839 2583 flags.go:64] FLAG: --version="false" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435843 2583 flags.go:64] FLAG: --vmodule="" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435847 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.435851 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435939 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:38.441282 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435943 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435946 2583 feature_gate.go:328] unrecognized feature gate: 
MixedCPUsAllocation Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435950 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435953 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435956 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435958 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435961 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435963 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435967 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435969 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435972 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435974 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435979 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435981 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:38.441871 ip-10-0-131-69 
kubenswrapper[2583]: W0422 17:53:38.435984 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435987 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435989 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435992 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435995 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:38.441871 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.435997 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436000 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436003 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436005 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436008 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436011 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436014 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436016 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:38.442405 ip-10-0-131-69 
kubenswrapper[2583]: W0422 17:53:38.436019 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436022 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436025 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436027 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436030 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436032 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436035 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436038 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436041 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436044 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436046 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436049 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:38.442405 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436051 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:38.443283 
ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436055 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436060 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436063 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436066 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436070 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436073 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436075 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436078 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436080 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436083 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436086 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436088 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 
17:53:38.436091 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436094 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436096 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436099 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436102 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436107 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:53:38.443283 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436110 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436113 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436116 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436119 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436123 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436125 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436128 2583 feature_gate.go:328] unrecognized feature 
gate: ImageModeStatusReporting Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436131 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436134 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436137 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436139 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436142 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436144 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436147 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436150 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436152 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436155 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436158 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436162 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 
17:53:38.436165 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.443812 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436168 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436171 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436173 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436176 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436178 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436181 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.436184 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.444296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.436735 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.444375 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.444392 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444440 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444445 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444448 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444451 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444454 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444457 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444459 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444463 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444465 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444468 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444471 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444474 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444477 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444480 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444483 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444486 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.444479 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444489 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444492 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444495 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444497 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444500 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444503 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444506 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444509 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444511 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444514 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444516 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444519 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444521 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444525 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444529 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444533 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444535 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444539 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444544 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.444958 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444547 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444549 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444552 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444555 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444557 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444560 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444562 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444565 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444568 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444570 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444573 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444575 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444578 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444581 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444583 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444587 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444590 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444592 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444595 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444598 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.445443 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444601 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444604 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444607 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444610 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444612 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444615 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444617 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444620 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444641 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444644 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444646 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444649 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444652 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444655 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444658 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444661 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444664 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444667 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444669 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.445951 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444672 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444675 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444678 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444681 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444684 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444686 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444689 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444692 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444695 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444698 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444701 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444703 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.444709 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444802 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444808 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:38.446400 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444811 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444814 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444817 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444820 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444822 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444825 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444828 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444831 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444833 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444836 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444838 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444842 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444846 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444850 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444853 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444856 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444859 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444863 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444866 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444868 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:38.446828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444871 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444874 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444877 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444879 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444882 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444885 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444888 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444890 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444893 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444896 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444898 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444901 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444904 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444906 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444909 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444911 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444914 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444917 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444919 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444922 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:38.447308 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444924 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444927 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444929 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444933 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444936 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444938 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444941 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444943 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444946 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444948 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444951 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444953 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444956 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444959 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444962 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444965 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444967 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444971 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444974 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444976 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:38.447829 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444979 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444981 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444984 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444986 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444989 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444991 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444994 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444997 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.444999 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445002 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445004 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445007 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445010 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445012 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445016 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445020 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445023 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445025 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445028 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445031 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:38.448304 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445033 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:38.448799 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445036 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:38.448799 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445038 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:38.448799 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:38.445041 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:38.448799 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.445046 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:38.448799 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.445696 2583 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:53:38.448799 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.448566 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:53:38.449512 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.449499 2583 server.go:1019] "Starting client certificate rotation"
Apr 22 17:53:38.449614 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.449598 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:38.449663 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.449651 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:38.478735 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.478717 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:38.481268 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.481246 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:38.493064 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.493047 2583 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:53:38.501570 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.501552 2583 log.go:25] "Validated CRI v1 image API"
Apr 22 17:53:38.503397 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.503382 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:53:38.507025 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.507002 2583 fs.go:135] Filesystem UUIDs: map[2ba7b79c-8945-418b-9188-8f7ae82ac0ce:/dev/nvme0n1p4 499f77fa-5186-4d73-93fd-250d47152eef:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 17:53:38.507092 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.507024 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:53:38.512196 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.512082 2583 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:38.510955482 +0000 UTC m=+0.388224641 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099317 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec271a79be54cf78b9277ec18790c384 SystemUUID:ec271a79-be54-cf78-b927-7ec18790c384 BootID:5bdc2cb8-e787-4aaa-8389-26d92e5a549f Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a4:30:1c:ec:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a4:30:1c:ec:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d6:31:2a:f6:9a:d4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:53:38.512196 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.512191 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:53:38.512318 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.512306 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:53:38.512505 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.512485 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:38.513466 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.513439 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:53:38.513606 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.513468 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-69.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:53:38.513674 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.513615 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:53:38.513674 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.513636 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:53:38.513674 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.513649 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:38.513674 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.513664 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:53:38.514991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.514979 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:53:38.515141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.515133 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:53:38.518295 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.518284 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:53:38.518339 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.518305 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:53:38.518946 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.518935 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:53:38.518976 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.518952 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:53:38.518976 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.518962 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:53:38.520166 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.520151 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:53:38.520236 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.520174 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:53:38.523317 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.523296 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:53:38.524667 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.524653 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 17:53:38.525882 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.525863 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zcrsg"
Apr 22 17:53:38.526382 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526366 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 17:53:38.526382 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526384 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526391 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526397 2583 plugins.go:616] "Loaded volume
plugin" pluginName="kubernetes.io/host-path" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526403 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526410 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526417 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526422 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526429 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526436 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526453 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:53:38.526529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.526462 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:53:38.527318 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.527308 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:53:38.527318 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.527318 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:53:38.530969 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.530950 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:53:38.531068 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.530984 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"ip-10-0-131-69.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:53:38.531068 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.530993 2583 server.go:1295] "Started kubelet" Apr 22 17:53:38.531068 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.531008 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-69.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:53:38.531218 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.531055 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:53:38.531218 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.531148 2583 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:53:38.531285 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.531222 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:53:38.531318 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.531298 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:53:38.531828 ip-10-0-131-69 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 17:53:38.532368 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.532241 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:53:38.533688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.533673 2583 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:53:38.535269 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.535248 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zcrsg" Apr 22 17:53:38.537751 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.536881 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-69.ec2.internal.18a8bf538ae2b723 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-69.ec2.internal,UID:ip-10-0-131-69.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-69.ec2.internal,},FirstTimestamp:2026-04-22 17:53:38.530965283 +0000 UTC m=+0.408234445,LastTimestamp:2026-04-22 17:53:38.530965283 +0000 UTC m=+0.408234445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-69.ec2.internal,}" Apr 22 17:53:38.538944 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.538793 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:53:38.538944 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.538815 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:38.541325 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.540611 2583 kubelet_node_status.go:515] "Error 
getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:38.541325 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.540954 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:53:38.541325 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541203 2583 factory.go:55] Registering systemd factory Apr 22 17:53:38.541325 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541220 2583 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:53:38.541579 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541360 2583 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:53:38.541579 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541374 2583 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:53:38.541579 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541562 2583 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:53:38.541579 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541570 2583 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:53:38.541755 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541670 2583 factory.go:153] Registering CRI-O factory Apr 22 17:53:38.541755 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541683 2583 factory.go:223] Registration of the crio container factory successfully Apr 22 17:53:38.541755 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541729 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:53:38.541755 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541753 2583 factory.go:103] Registering Raw factory Apr 22 17:53:38.541917 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.541769 2583 manager.go:1196] Started watching for new ooms in 
manager Apr 22 17:53:38.541917 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.541889 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:53:38.542261 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.542245 2583 manager.go:319] Starting recovery of all containers Apr 22 17:53:38.543662 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.543616 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:38.545511 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.545472 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-69.ec2.internal\" not found" node="ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.553431 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.553416 2583 manager.go:324] Recovery completed Apr 22 17:53:38.557457 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.557445 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.561568 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.561553 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.561642 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.561578 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.561642 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.561591 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.562114 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.562094 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:53:38.562114 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:53:38.562111 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:53:38.562220 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.562132 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:38.564556 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.564544 2583 policy_none.go:49] "None policy: Start" Apr 22 17:53:38.564597 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.564559 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:53:38.564597 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.564569 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:53:38.597232 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597213 2583 manager.go:341] "Starting Device Plugin manager" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.597274 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597287 2583 server.go:85] "Starting device plugin registration server" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597534 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597547 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597729 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597823 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.597834 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 
17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.598534 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:53:38.601781 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.598572 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:38.659299 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.659266 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:53:38.660521 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.660500 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:53:38.660647 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.660526 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:53:38.660647 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.660549 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 17:53:38.660647 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.660557 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:53:38.660647 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.660595 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:53:38.663600 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.663584 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:38.698188 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.698145 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.699146 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.699129 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.699231 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.699160 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.699231 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.699172 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.699231 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.699192 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.708084 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.708066 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.708148 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.708088 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-69.ec2.internal\": node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:38.724273 
ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.724253 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:38.761418 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.761387 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal"] Apr 22 17:53:38.761483 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.761468 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.763035 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.763017 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.763102 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.763045 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.763102 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.763060 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.764528 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.764515 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.764663 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.764649 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.764700 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.764675 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.765487 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.765472 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.765557 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.765503 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.765557 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.765472 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.765557 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.765537 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.765557 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.765547 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.765557 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.765518 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.766849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.766836 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.766890 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.766863 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:38.767534 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.767522 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:38.767597 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.767540 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:38.767597 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.767551 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:38.796336 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.796315 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-69.ec2.internal\" not found" node="ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.801370 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.801354 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-69.ec2.internal\" not found" node="ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.825123 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.825109 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:38.843373 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.843353 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5240e71df4ae79d7d950c1a36bd685b5-config\") pod 
\"kube-apiserver-proxy-ip-10-0-131-69.ec2.internal\" (UID: \"5240e71df4ae79d7d950c1a36bd685b5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.843443 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.843381 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5997c5e5e00da4f1b4bc8591f471ca46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal\" (UID: \"5997c5e5e00da4f1b4bc8591f471ca46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.843443 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.843399 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5997c5e5e00da4f1b4bc8591f471ca46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal\" (UID: \"5997c5e5e00da4f1b4bc8591f471ca46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.925421 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:38.925364 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:38.943785 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.943751 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5997c5e5e00da4f1b4bc8591f471ca46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal\" (UID: \"5997c5e5e00da4f1b4bc8591f471ca46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.943892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.943788 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5997c5e5e00da4f1b4bc8591f471ca46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal\" (UID: \"5997c5e5e00da4f1b4bc8591f471ca46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.943892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.943814 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5240e71df4ae79d7d950c1a36bd685b5-config\") pod \"kube-apiserver-proxy-ip-10-0-131-69.ec2.internal\" (UID: \"5240e71df4ae79d7d950c1a36bd685b5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.943892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.943863 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5240e71df4ae79d7d950c1a36bd685b5-config\") pod \"kube-apiserver-proxy-ip-10-0-131-69.ec2.internal\" (UID: \"5240e71df4ae79d7d950c1a36bd685b5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.943892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.943865 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5997c5e5e00da4f1b4bc8591f471ca46-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal\" (UID: \"5997c5e5e00da4f1b4bc8591f471ca46\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:38.943892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:38.943865 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5997c5e5e00da4f1b4bc8591f471ca46-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal\" (UID: \"5997c5e5e00da4f1b4bc8591f471ca46\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:39.026218 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:39.026135 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:39.098684 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.098658 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" Apr 22 17:53:39.104253 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.104236 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" Apr 22 17:53:39.126824 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:39.126803 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:39.227413 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:39.227370 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:39.327971 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:39.327892 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:39.428592 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:39.428558 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-69.ec2.internal\" not found" Apr 22 17:53:39.449062 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.449047 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 17:53:39.449649 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.449172 2583 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:39.449649 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.449201 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:53:39.527213 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.527189 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:39.537158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.537129 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:38 +0000 UTC" deadline="2027-10-14 00:31:37.423008707 +0000 UTC"
Apr 22 17:53:39.537158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.537155 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12942h37m57.885856753s"
Apr 22 17:53:39.540382 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.540364 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:53:39.540457 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.540380 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal"
Apr 22 17:53:39.550780 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.550758 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:39.553069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.553056 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal"
Apr 22 17:53:39.555410 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.555394 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:39.566102 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.566086 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:53:39.578815 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.578766 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j78zp"
Apr 22 17:53:39.582833 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:39.582807 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5240e71df4ae79d7d950c1a36bd685b5.slice/crio-9bea76248c07f43d02ecf5c0bd3041ced439499223242774e26dabf01a671c2f WatchSource:0}: Error finding container 9bea76248c07f43d02ecf5c0bd3041ced439499223242774e26dabf01a671c2f: Status 404 returned error can't find the container with id 9bea76248c07f43d02ecf5c0bd3041ced439499223242774e26dabf01a671c2f
Apr 22 17:53:39.583341 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:39.583328 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5997c5e5e00da4f1b4bc8591f471ca46.slice/crio-0d90cde477d7dd748ab265f51bb42ad1c2bee55e0a70f8003aa7e6f9b3dca7db WatchSource:0}: Error finding container 0d90cde477d7dd748ab265f51bb42ad1c2bee55e0a70f8003aa7e6f9b3dca7db: Status 404 returned error can't find the container with id 0d90cde477d7dd748ab265f51bb42ad1c2bee55e0a70f8003aa7e6f9b3dca7db
Apr 22 17:53:39.587808 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.587794 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:53:39.587994 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.587978 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-j78zp"
Apr 22 17:53:39.663830 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.663598 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" event={"ID":"5997c5e5e00da4f1b4bc8591f471ca46","Type":"ContainerStarted","Data":"0d90cde477d7dd748ab265f51bb42ad1c2bee55e0a70f8003aa7e6f9b3dca7db"}
Apr 22 17:53:39.664544 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.664523 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" event={"ID":"5240e71df4ae79d7d950c1a36bd685b5","Type":"ContainerStarted","Data":"9bea76248c07f43d02ecf5c0bd3041ced439499223242774e26dabf01a671c2f"}
Apr 22 17:53:39.735691 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:39.735666 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:40.519491 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.519457 2583 apiserver.go:52] "Watching apiserver"
Apr 22 17:53:40.527441 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.527406 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:53:40.528635 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.528596 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kpz56","openshift-multus/network-metrics-daemon-xk988","openshift-ovn-kubernetes/ovnkube-node-t6kbb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt","openshift-cluster-node-tuning-operator/tuned-56xcg","openshift-image-registry/node-ca-7p29d","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal","openshift-network-diagnostics/network-check-target-p96l2","openshift-network-operator/iptables-alerter-96thx","kube-system/konnectivity-agent-xj964","kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal","openshift-dns/node-resolver-qrk67","openshift-multus/multus-additional-cni-plugins-vh7jd"]
Apr 22 17:53:40.530684 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.530657 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:40.531892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.531868 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:40.531986 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.531947 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:40.533141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.533079 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.533141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.533111 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:53:40.533283 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.533172 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fhsjw\""
Apr 22 17:53:40.533283 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.533195 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:53:40.534252 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.534233 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.535652 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.535339 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.535652 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.535381 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:53:40.535652 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.535398 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:40.535652 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.535457 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:40.535938 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.535663 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fb6nv\""
Apr 22 17:53:40.535938 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.535705 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.536676 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.536655 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.536761 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.536714 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.536819 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.536759 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-94bz7\""
Apr 22 17:53:40.536960 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.536940 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-96thx"
Apr 22 17:53:40.539321 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.539126 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.539881 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.539540 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.540243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.540096 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.540483 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.540462 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:53:40.540829 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.540749 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hqmcr\""
Apr 22 17:53:40.542848 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.542815 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qrk67"
Apr 22 17:53:40.543017 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.542999 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.544299 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.544278 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.544465 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.544446 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:53:40.544571 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.544497 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.544571 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.544558 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-589vt\""
Apr 22 17:53:40.544702 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.544657 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:53:40.544947 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.544929 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb"
Apr 22 17:53:40.545214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.545194 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.545667 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.545649 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:53:40.545667 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.545660 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:53:40.545820 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.545702 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.545820 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.545716 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sg8sc\""
Apr 22 17:53:40.545820 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.545733 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-556xx\""
Apr 22 17:53:40.546592 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.546574 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7p29d"
Apr 22 17:53:40.547905 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.547774 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:53:40.547905 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.547788 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-99lft\""
Apr 22 17:53:40.547905 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.547774 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.547905 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.547803 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:53:40.547905 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.547806 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.548936 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.548913 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:53:40.549038 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.548950 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:53:40.549038 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.549016 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:53:40.549271 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.549255 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:53:40.549333 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.549285 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ghrmk\""
Apr 22 17:53:40.551277 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551256 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-modprobe-d\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.551365 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551293 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysconfig\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.551365 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551319 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.551365 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551356 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-device-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.551522 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551381 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-system-cni-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551522 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551402 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-os-release\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551522 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551438 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cni-binary-copy\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.551522 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551465 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6gw\" (UniqueName: \"kubernetes.io/projected/0921ce28-1383-4534-bfc2-4751014a996a-kube-api-access-8j6gw\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.551522 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551495 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21690994-61ae-4434-baa1-49a8adf56490-host-slash\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx"
Apr 22 17:53:40.551522 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551516 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-cni-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551538 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:53:40.551772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551543 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-kubelet\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551601 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-os-release\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.551772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551683 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-kubernetes\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.551772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551731 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-sys\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.551772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551756 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-hosts-file\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67"
Apr 22 17:53:40.551991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551786 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xsk\" (UniqueName: \"kubernetes.io/projected/c266f31e-39da-4b15-a687-f1304c2e67b7-kube-api-access-s6xsk\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:40.551991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551820 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-cni-bin\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551860 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-cni-multus\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551901 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cnibin\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.551991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551930 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f625cc6f-1340-411a-b28d-19e397e691de-cni-binary-copy\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.551991 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.551970 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-run\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.552173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552010 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/661612b0-cef9-4d3b-ad68-e9507dc62d38-konnectivity-ca\") pod \"konnectivity-agent-xj964\" (UID: \"661612b0-cef9-4d3b-ad68-e9507dc62d38\") " pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:40.552173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552064 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-cnibin\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552095 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-k8s-cni-cncf-io\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552124 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-hostroot\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552148 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-var-lib-kubelet\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552173 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-systemd\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552198 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qtn6\" (UniqueName: \"kubernetes.io/projected/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-kube-api-access-7qtn6\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552224 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn752\" (UniqueName: \"kubernetes.io/projected/21690994-61ae-4434-baa1-49a8adf56490-kube-api-access-qn752\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552249 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552272 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552298 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-registration-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552322 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wxb\" (UniqueName: \"kubernetes.io/projected/047f2672-ea3f-4aac-a645-3bbf7fa9342c-kube-api-access-88wxb\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.552349 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552346 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f625cc6f-1340-411a-b28d-19e397e691de-multus-daemon-config\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552371 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-system-cni-dir\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552412 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552443 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552471 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-sys-fs\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552494 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-multus-certs\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552517 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysctl-conf\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552541 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-tmp-dir\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552567 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21690994-61ae-4434-baa1-49a8adf56490-iptables-alerter-script\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552594 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-etc-kubernetes\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552618 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-lib-modules\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552660 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552700 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-netns\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.552738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552723 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27rx8\" (UniqueName: \"kubernetes.io/projected/f625cc6f-1340-411a-b28d-19e397e691de-kube-api-access-27rx8\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552746 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysctl-d\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552778 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-host\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552798 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0921ce28-1383-4534-bfc2-4751014a996a-etc-tuned\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552841 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-socket-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552876 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-conf-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552929 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/661612b0-cef9-4d3b-ad68-e9507dc62d38-agent-certs\") pod \"konnectivity-agent-xj964\" (UID: \"661612b0-cef9-4d3b-ad68-e9507dc62d38\") " pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552954 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-etc-selinux\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.552981 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName:
\"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-socket-dir-parent\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.553008 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0921ce28-1383-4534-bfc2-4751014a996a-tmp\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.553243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.553032 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcwvl\" (UniqueName: \"kubernetes.io/projected/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-kube-api-access-zcwvl\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.588598 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.588568 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:39 +0000 UTC" deadline="2028-02-02 10:21:03.308897256 +0000 UTC" Apr 22 17:53:40.588710 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.588601 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15616h27m22.720301449s" Apr 22 17:53:40.641723 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.641694 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:53:40.653702 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653676 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-slash\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.653702 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653705 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-systemd\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.653868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653721 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovnkube-config\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.653868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653747 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:53:40.653868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653773 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-netns\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.653969 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653889 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-netns\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.653969 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653932 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27rx8\" (UniqueName: \"kubernetes.io/projected/f625cc6f-1340-411a-b28d-19e397e691de-kube-api-access-27rx8\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.653969 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653962 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysctl-d\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653980 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-host\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.653996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0921ce28-1383-4534-bfc2-4751014a996a-etc-tuned\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654011 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-socket-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654026 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-conf-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654047 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-var-lib-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654062 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-host\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654074 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.654107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654099 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-ovn\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654108 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-conf-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654121 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-host\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654125 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysctl-d\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/661612b0-cef9-4d3b-ad68-e9507dc62d38-agent-certs\") pod \"konnectivity-agent-xj964\" (UID: \"661612b0-cef9-4d3b-ad68-e9507dc62d38\") " pod="kube-system/konnectivity-agent-xj964" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654180 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-etc-selinux\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654175 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-socket-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654201 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-socket-dir-parent\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654218 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0921ce28-1383-4534-bfc2-4751014a996a-tmp\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654233 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcwvl\" (UniqueName: \"kubernetes.io/projected/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-kube-api-access-zcwvl\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654252 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjm8\" (UniqueName: \"kubernetes.io/projected/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-kube-api-access-bgjm8\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654270 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-modprobe-d\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654247 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-etc-selinux\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654293 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysconfig\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654293 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-socket-dir-parent\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 
17:53:40.654325 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654330 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654354 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysconfig\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.654452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654370 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654386 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-device-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654422 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-modprobe-d\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654424 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-system-cni-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654495 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-os-release\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654504 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-system-cni-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654534 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cni-binary-copy\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654551 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-device-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654564 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6gw\" (UniqueName: \"kubernetes.io/projected/0921ce28-1383-4534-bfc2-4751014a996a-kube-api-access-8j6gw\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654589 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21690994-61ae-4434-baa1-49a8adf56490-host-slash\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654614 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-cni-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654636 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-os-release\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-kubelet\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654680 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-os-release\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654683 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21690994-61ae-4434-baa1-49a8adf56490-host-slash\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654716 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-kubelet\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654745 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-kubernetes\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654763 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-os-release\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.655289 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654770 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-sys\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654793 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-hosts-file\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-multus-cni-dir\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654843 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-sys\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654848 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-kubernetes\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654873 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-kubelet\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xsk\" (UniqueName: \"kubernetes.io/projected/c266f31e-39da-4b15-a687-f1304c2e67b7-kube-api-access-s6xsk\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654909 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-hosts-file\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654932 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-node-log\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654959 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-cni-netd\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.654984 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-cni-bin\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655013 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-cni-multus\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655042 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cnibin\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655044 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-cni-bin\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655068 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-run-netns\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655091 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovn-node-metrics-cert\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655096 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-var-lib-cni-multus\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655111 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cni-binary-copy\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655136 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cnibin\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 
17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655181 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f625cc6f-1340-411a-b28d-19e397e691de-cni-binary-copy\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655212 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-run\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655237 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/661612b0-cef9-4d3b-ad68-e9507dc62d38-konnectivity-ca\") pod \"konnectivity-agent-xj964\" (UID: \"661612b0-cef9-4d3b-ad68-e9507dc62d38\") " pod="kube-system/konnectivity-agent-xj964" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655259 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-cnibin\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655280 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-k8s-cni-cncf-io\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 
17:53:40.655315 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-k8s-cni-cncf-io\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655328 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-run\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655365 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-hostroot\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655395 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-var-lib-kubelet\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655422 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-systemd\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655451 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkz4p\" (UniqueName: \"kubernetes.io/projected/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-kube-api-access-nkz4p\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655478 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qtn6\" (UniqueName: \"kubernetes.io/projected/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-kube-api-access-7qtn6\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655505 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn752\" (UniqueName: \"kubernetes.io/projected/21690994-61ae-4434-baa1-49a8adf56490-kube-api-access-qn752\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655510 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-cnibin\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655536 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 
17:53:40.656833 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655562 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-cni-bin\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655573 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-var-lib-kubelet\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655589 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655665 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-hostroot\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655724 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-systemd\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.657413 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655616 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-registration-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655795 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-registration-dir\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655852 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/661612b0-cef9-4d3b-ad68-e9507dc62d38-konnectivity-ca\") pod \"konnectivity-agent-xj964\" (UID: \"661612b0-cef9-4d3b-ad68-e9507dc62d38\") " pod="kube-system/konnectivity-agent-xj964" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655866 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.655898 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655903 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-88wxb\" (UniqueName: \"kubernetes.io/projected/047f2672-ea3f-4aac-a645-3bbf7fa9342c-kube-api-access-88wxb\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt"
Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.655934 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f625cc6f-1340-411a-b28d-19e397e691de-multus-daemon-config\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.655966 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs podName:c266f31e-39da-4b15-a687-f1304c2e67b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.155946144 +0000 UTC m=+3.033215312 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs") pod "network-metrics-daemon-xk988" (UID: "c266f31e-39da-4b15-a687-f1304c2e67b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656141 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-system-cni-dir\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd"
Apr 22 17:53:40.657413 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656257 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-systemd-units\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") "
pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656263 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-system-cni-dir\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656283 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-sys-fs\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656309 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-multus-certs\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656334 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysctl-conf\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656360 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-etc-openvswitch\") pod 
\"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656386 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-log-socket\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656410 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-run-ovn-kubernetes\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656436 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656460 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-env-overrides\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656486 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-tmp-dir\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656514 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovnkube-script-lib\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656538 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-serviceca\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656564 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21690994-61ae-4434-baa1-49a8adf56490-iptables-alerter-script\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656604 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-etc-kubernetes\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656638 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f625cc6f-1340-411a-b28d-19e397e691de-cni-binary-copy\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656651 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-lib-modules\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.658113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656715 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f625cc6f-1340-411a-b28d-19e397e691de-multus-daemon-config\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656722 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656743 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-lib-modules\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656821 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/047f2672-ea3f-4aac-a645-3bbf7fa9342c-sys-fs\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656830 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0921ce28-1383-4534-bfc2-4751014a996a-etc-sysctl-conf\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656861 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-host-run-multus-certs\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.656903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f625cc6f-1340-411a-b28d-19e397e691de-etc-kubernetes\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.657055 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-tmp-dir\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.657358 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/21690994-61ae-4434-baa1-49a8adf56490-iptables-alerter-script\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.657510 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.657937 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0921ce28-1383-4534-bfc2-4751014a996a-etc-tuned\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.658743 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/661612b0-cef9-4d3b-ad68-e9507dc62d38-agent-certs\") pod \"konnectivity-agent-xj964\" (UID: \"661612b0-cef9-4d3b-ad68-e9507dc62d38\") " pod="kube-system/konnectivity-agent-xj964" Apr 22 17:53:40.658927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.658801 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0921ce28-1383-4534-bfc2-4751014a996a-tmp\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") " pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.659532 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.659517 2583 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:40.659588 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.659537 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:40.659588 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.659549 2583 projected.go:194] Error preparing data for projected volume kube-api-access-d5c4j for pod openshift-network-diagnostics/network-check-target-p96l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:40.659719 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:40.659610 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j podName:e510245b-cd68-4ee7-8f7d-e72ddbd61118 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.159595322 +0000 UTC m=+3.036864469 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-d5c4j" (UniqueName: "kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j") pod "network-check-target-p96l2" (UID: "e510245b-cd68-4ee7-8f7d-e72ddbd61118") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:40.661549 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.661526 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27rx8\" (UniqueName: \"kubernetes.io/projected/f625cc6f-1340-411a-b28d-19e397e691de-kube-api-access-27rx8\") pod \"multus-kpz56\" (UID: \"f625cc6f-1340-411a-b28d-19e397e691de\") " pod="openshift-multus/multus-kpz56"
Apr 22 17:53:40.662185 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.662171 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcwvl\" (UniqueName: \"kubernetes.io/projected/9f4638bd-d27c-475b-9ecd-d0faa1ba55d2-kube-api-access-zcwvl\") pod \"node-resolver-qrk67\" (UID: \"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2\") " pod="openshift-dns/node-resolver-qrk67"
Apr 22 17:53:40.663850 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.663828 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xsk\" (UniqueName: \"kubernetes.io/projected/c266f31e-39da-4b15-a687-f1304c2e67b7-kube-api-access-s6xsk\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:40.667110 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.667022 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j6gw\" (UniqueName: \"kubernetes.io/projected/0921ce28-1383-4534-bfc2-4751014a996a-kube-api-access-8j6gw\") pod \"tuned-56xcg\" (UID: \"0921ce28-1383-4534-bfc2-4751014a996a\") "
pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.673219 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.673200 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn752\" (UniqueName: \"kubernetes.io/projected/21690994-61ae-4434-baa1-49a8adf56490-kube-api-access-qn752\") pod \"iptables-alerter-96thx\" (UID: \"21690994-61ae-4434-baa1-49a8adf56490\") " pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.673561 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.673540 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qtn6\" (UniqueName: \"kubernetes.io/projected/dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4-kube-api-access-7qtn6\") pod \"multus-additional-cni-plugins-vh7jd\" (UID: \"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4\") " pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.673995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.673979 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wxb\" (UniqueName: \"kubernetes.io/projected/047f2672-ea3f-4aac-a645-3bbf7fa9342c-kube-api-access-88wxb\") pod \"aws-ebs-csi-driver-node-74dnt\" (UID: \"047f2672-ea3f-4aac-a645-3bbf7fa9342c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.757236 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757205 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-kubelet\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757236 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-node-log\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757267 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-cni-netd\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757294 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-run-netns\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757328 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovn-node-metrics-cert\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757339 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-kubelet\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757364 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkz4p\" 
(UniqueName: \"kubernetes.io/projected/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-kube-api-access-nkz4p\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-cni-bin\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757407 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-cni-netd\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757418 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-run-netns\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757454 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-cni-bin\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757463 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-node-log\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757468 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-systemd-units\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757497 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-systemd-units\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757497 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-etc-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.757513 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757523 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-etc-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757528 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-log-socket\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757556 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-log-socket\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-run-ovn-kubernetes\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757587 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-run-ovn-kubernetes\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757585 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757949 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-env-overrides\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757998 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovnkube-script-lib\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758034 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-serviceca\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.758141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758073 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-slash\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758152 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-systemd\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758196 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovnkube-config\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-var-lib-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758292 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.757671 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758328 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-ovn\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758543 ip-10-0-131-69 kubenswrapper[2583]: 
I0422 17:53:40.758387 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-ovn\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758561 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-serviceca\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758669 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-env-overrides\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758699 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-host\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758761 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjm8\" (UniqueName: \"kubernetes.io/projected/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-kube-api-access-bgjm8\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758777 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-var-lib-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-openvswitch\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758830 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-host\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.758895 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758881 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-host-slash\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.759245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758927 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovnkube-config\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.759245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.758951 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-run-systemd\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.759245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.759191 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovnkube-script-lib\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.763135 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.763108 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-ovn-node-metrics-cert\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.766317 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.766293 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjm8\" (UniqueName: \"kubernetes.io/projected/b56ca0f7-5e42-4d61-9c1d-fff86d2affdd-kube-api-access-bgjm8\") pod \"node-ca-7p29d\" (UID: \"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd\") " pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:40.766406 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.766299 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkz4p\" (UniqueName: \"kubernetes.io/projected/44a6788e-ee8d-4d8f-9255-fc53fdbd083f-kube-api-access-nkz4p\") pod \"ovnkube-node-t6kbb\" (UID: \"44a6788e-ee8d-4d8f-9255-fc53fdbd083f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.844463 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.844334 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xj964" Apr 22 17:53:40.853926 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.853898 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" Apr 22 17:53:40.860509 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.860488 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-56xcg" Apr 22 17:53:40.866062 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.866042 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-96thx" Apr 22 17:53:40.870401 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.870384 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:40.872550 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.872533 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kpz56" Apr 22 17:53:40.879097 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.879075 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qrk67" Apr 22 17:53:40.885696 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.885677 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" Apr 22 17:53:40.886958 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.886941 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:40.892215 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.892192 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:53:40.897760 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:40.897731 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7p29d" Apr 22 17:53:41.161465 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.161392 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:53:41.161465 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.161443 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:53:41.161645 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:41.161556 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:41.161645 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:41.161568 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:41.161645 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:41.161582 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:41.161645 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:41.161593 2583 projected.go:194] Error preparing data for projected 
volume kube-api-access-d5c4j for pod openshift-network-diagnostics/network-check-target-p96l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:41.161645 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:41.161637 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs podName:c266f31e-39da-4b15-a687-f1304c2e67b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:42.161603351 +0000 UTC m=+4.038872514 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs") pod "network-metrics-daemon-xk988" (UID: "c266f31e-39da-4b15-a687-f1304c2e67b7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:41.161817 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:41.161655 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j podName:e510245b-cd68-4ee7-8f7d-e72ddbd61118 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:42.161645886 +0000 UTC m=+4.038915032 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-d5c4j" (UniqueName: "kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j") pod "network-check-target-p96l2" (UID: "e510245b-cd68-4ee7-8f7d-e72ddbd61118") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:41.206444 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:41.206414 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd5d086e_cf48_4cf7_91c6_f3ecfb0513a4.slice/crio-84a1bc5c6ea9e944156bb28f1d0539d989aef69d0b8a2b55d0a705a658724ae2 WatchSource:0}: Error finding container 84a1bc5c6ea9e944156bb28f1d0539d989aef69d0b8a2b55d0a705a658724ae2: Status 404 returned error can't find the container with id 84a1bc5c6ea9e944156bb28f1d0539d989aef69d0b8a2b55d0a705a658724ae2 Apr 22 17:53:41.209142 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:41.209101 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf625cc6f_1340_411a_b28d_19e397e691de.slice/crio-a8e0c7800aaec7e1ce2360bb57791374b792673ba8c0ec6cecada658e739fdcd WatchSource:0}: Error finding container a8e0c7800aaec7e1ce2360bb57791374b792673ba8c0ec6cecada658e739fdcd: Status 404 returned error can't find the container with id a8e0c7800aaec7e1ce2360bb57791374b792673ba8c0ec6cecada658e739fdcd Apr 22 17:53:41.211370 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:41.211289 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f4638bd_d27c_475b_9ecd_d0faa1ba55d2.slice/crio-1d28481f9cec381eab9d1f65783dfcb0f568fead746102a884e8d45c6809e7f6 WatchSource:0}: Error finding container 1d28481f9cec381eab9d1f65783dfcb0f568fead746102a884e8d45c6809e7f6: Status 404 returned error can't find the 
container with id 1d28481f9cec381eab9d1f65783dfcb0f568fead746102a884e8d45c6809e7f6 Apr 22 17:53:41.212219 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:41.212173 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a6788e_ee8d_4d8f_9255_fc53fdbd083f.slice/crio-5889631fa72656d90fdb757ee76be305f14451a5a4f9686280d09abef72520e9 WatchSource:0}: Error finding container 5889631fa72656d90fdb757ee76be305f14451a5a4f9686280d09abef72520e9: Status 404 returned error can't find the container with id 5889631fa72656d90fdb757ee76be305f14451a5a4f9686280d09abef72520e9 Apr 22 17:53:41.213057 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:41.212998 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56ca0f7_5e42_4d61_9c1d_fff86d2affdd.slice/crio-4a527a23f29eb0940c3eada4fc662a454173aeabcee7c7204929a65f7b9285bb WatchSource:0}: Error finding container 4a527a23f29eb0940c3eada4fc662a454173aeabcee7c7204929a65f7b9285bb: Status 404 returned error can't find the container with id 4a527a23f29eb0940c3eada4fc662a454173aeabcee7c7204929a65f7b9285bb Apr 22 17:53:41.216490 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:53:41.216285 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21690994_61ae_4434_baa1_49a8adf56490.slice/crio-25962995a781a668cde50169f34f196dd5902405e3461d892fbdff6a9d1b97c9 WatchSource:0}: Error finding container 25962995a781a668cde50169f34f196dd5902405e3461d892fbdff6a9d1b97c9: Status 404 returned error can't find the container with id 25962995a781a668cde50169f34f196dd5902405e3461d892fbdff6a9d1b97c9 Apr 22 17:53:41.589540 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.589453 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:39 +0000 UTC" 
deadline="2028-01-04 13:40:14.845607118 +0000 UTC" Apr 22 17:53:41.589540 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.589490 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14923h46m33.256121244s" Apr 22 17:53:41.680442 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.680381 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" event={"ID":"047f2672-ea3f-4aac-a645-3bbf7fa9342c","Type":"ContainerStarted","Data":"be2ba01f22ffc7140b3b88e3d4121df5bceb0cf68d518f58636b3b8324b56c22"} Apr 22 17:53:41.686113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.686081 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xj964" event={"ID":"661612b0-cef9-4d3b-ad68-e9507dc62d38","Type":"ContainerStarted","Data":"320adef2d5a9d7ef7702d68395b8417acf8e57dbc5102ca9febaba06facc8fb3"} Apr 22 17:53:41.690878 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.690849 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-96thx" event={"ID":"21690994-61ae-4434-baa1-49a8adf56490","Type":"ContainerStarted","Data":"25962995a781a668cde50169f34f196dd5902405e3461d892fbdff6a9d1b97c9"} Apr 22 17:53:41.696500 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.696455 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qrk67" event={"ID":"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2","Type":"ContainerStarted","Data":"1d28481f9cec381eab9d1f65783dfcb0f568fead746102a884e8d45c6809e7f6"} Apr 22 17:53:41.699490 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.699459 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kpz56" event={"ID":"f625cc6f-1340-411a-b28d-19e397e691de","Type":"ContainerStarted","Data":"a8e0c7800aaec7e1ce2360bb57791374b792673ba8c0ec6cecada658e739fdcd"} Apr 22 17:53:41.705836 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:53:41.704901 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerStarted","Data":"84a1bc5c6ea9e944156bb28f1d0539d989aef69d0b8a2b55d0a705a658724ae2"} Apr 22 17:53:41.709041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.709011 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" event={"ID":"5240e71df4ae79d7d950c1a36bd685b5","Type":"ContainerStarted","Data":"1518411f13076861c18c4aa0145283dc8a1a663deac61434c06fe2d7ff10a9e4"} Apr 22 17:53:41.720073 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.720044 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"5889631fa72656d90fdb757ee76be305f14451a5a4f9686280d09abef72520e9"} Apr 22 17:53:41.726916 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.726867 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-69.ec2.internal" podStartSLOduration=2.726851503 podStartE2EDuration="2.726851503s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:41.726150264 +0000 UTC m=+3.603419432" watchObservedRunningTime="2026-04-22 17:53:41.726851503 +0000 UTC m=+3.604120675" Apr 22 17:53:41.732008 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:41.731979 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-56xcg" event={"ID":"0921ce28-1383-4534-bfc2-4751014a996a","Type":"ContainerStarted","Data":"66a2d22bfa4453efc3d04d7927566a06216d0d364100bb0d416b9a713ecc475c"} Apr 22 17:53:41.746643 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:53:41.744831 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7p29d" event={"ID":"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd","Type":"ContainerStarted","Data":"4a527a23f29eb0940c3eada4fc662a454173aeabcee7c7204929a65f7b9285bb"}
Apr 22 17:53:42.169986 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:42.169948 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:42.170174 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:42.170042 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:42.170174 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.170145 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:42.170174 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.170170 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:42.170330 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.170231 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs podName:c266f31e-39da-4b15-a687-f1304c2e67b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:44.170213058 +0000 UTC m=+6.047482222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs") pod "network-metrics-daemon-xk988" (UID: "c266f31e-39da-4b15-a687-f1304c2e67b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:42.170330 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.170173 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:42.170330 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.170272 2583 projected.go:194] Error preparing data for projected volume kube-api-access-d5c4j for pod openshift-network-diagnostics/network-check-target-p96l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:42.170472 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.170344 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j podName:e510245b-cd68-4ee7-8f7d-e72ddbd61118 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:44.170314094 +0000 UTC m=+6.047583252 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-d5c4j" (UniqueName: "kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j") pod "network-check-target-p96l2" (UID: "e510245b-cd68-4ee7-8f7d-e72ddbd61118") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:42.663426 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:42.663390 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:42.663934 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.663531 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:42.663998 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:42.663973 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:42.664196 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:42.664061 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:42.769974 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:42.769935 2583 generic.go:358] "Generic (PLEG): container finished" podID="5997c5e5e00da4f1b4bc8591f471ca46" containerID="69b0bf5962f927e9244613bd7a602b6a70f9d48f395215784fdada6e54d4a8bb" exitCode=0
Apr 22 17:53:42.770133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:42.770112 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" event={"ID":"5997c5e5e00da4f1b4bc8591f471ca46","Type":"ContainerDied","Data":"69b0bf5962f927e9244613bd7a602b6a70f9d48f395215784fdada6e54d4a8bb"}
Apr 22 17:53:43.776595 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:43.775784 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" event={"ID":"5997c5e5e00da4f1b4bc8591f471ca46","Type":"ContainerStarted","Data":"64610931b75263933d080cff968497c94129812222337a5e96702549cbd50a63"}
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:44.188392 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:44.188467 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.188676 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.188699 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.188714 2583 projected.go:194] Error preparing data for projected volume kube-api-access-d5c4j for pod openshift-network-diagnostics/network-check-target-p96l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.188778 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j podName:e510245b-cd68-4ee7-8f7d-e72ddbd61118 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:48.188758857 +0000 UTC m=+10.066028020 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-d5c4j" (UniqueName: "kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j") pod "network-check-target-p96l2" (UID: "e510245b-cd68-4ee7-8f7d-e72ddbd61118") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.189211 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:44.189464 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.189292 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs podName:c266f31e-39da-4b15-a687-f1304c2e67b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:48.189274536 +0000 UTC m=+10.066543699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs") pod "network-metrics-daemon-xk988" (UID: "c266f31e-39da-4b15-a687-f1304c2e67b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:44.662153 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:44.662062 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:44.662302 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.662218 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:44.662589 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:44.662571 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:44.662706 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:44.662685 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:46.662153 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:46.662114 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:46.662620 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:46.662114 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:46.662620 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:46.662274 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:46.662620 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:46.662323 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:48.220641 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:48.220584 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:48.220670 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.220755 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.220818 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.220834 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs podName:c266f31e-39da-4b15-a687-f1304c2e67b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.220814134 +0000 UTC m=+18.098083296 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs") pod "network-metrics-daemon-xk988" (UID: "c266f31e-39da-4b15-a687-f1304c2e67b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.220838 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.220853 2583 projected.go:194] Error preparing data for projected volume kube-api-access-d5c4j for pod openshift-network-diagnostics/network-check-target-p96l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:48.221078 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.220902 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j podName:e510245b-cd68-4ee7-8f7d-e72ddbd61118 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.220885697 +0000 UTC m=+18.098154857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-d5c4j" (UniqueName: "kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j") pod "network-check-target-p96l2" (UID: "e510245b-cd68-4ee7-8f7d-e72ddbd61118") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:48.661864 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:48.661784 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:48.663028 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.662971 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:48.663171 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:48.663042 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:48.663171 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:48.663148 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:50.661814 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:50.661722 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:50.661814 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:50.661772 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:50.662284 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:50.661888 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:50.662284 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:50.661999 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:52.661322 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:52.661286 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:52.661854 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:52.661289 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:52.661854 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:52.661414 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:52.661854 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:52.661506 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:54.661507 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:54.661467 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:54.661507 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:54.661489 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:54.661973 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:54.661603 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:54.661973 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:54.661736 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:56.275560 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:56.275519 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:56.275588 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.275709 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.275728 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.275755 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.275770 2583 projected.go:194] Error preparing data for projected volume kube-api-access-d5c4j for pod openshift-network-diagnostics/network-check-target-p96l2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.275789 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs podName:c266f31e-39da-4b15-a687-f1304c2e67b7 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:12.275768438 +0000 UTC m=+34.153037593 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs") pod "network-metrics-daemon-xk988" (UID: "c266f31e-39da-4b15-a687-f1304c2e67b7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:56.276029 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.275832 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j podName:e510245b-cd68-4ee7-8f7d-e72ddbd61118 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:12.275801554 +0000 UTC m=+34.153070716 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-d5c4j" (UniqueName: "kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j") pod "network-check-target-p96l2" (UID: "e510245b-cd68-4ee7-8f7d-e72ddbd61118") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:56.661437 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:56.661348 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:56.661437 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:56.661398 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:56.661619 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.661505 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:56.661710 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:56.661640 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:58.661650 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.661605 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:53:58.662388 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:58.661738 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7"
Apr 22 17:53:58.662388 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.661786 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:53:58.662388 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:53:58.661845 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:53:58.800278 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.800022 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerStarted","Data":"3bfbab4d0ad693bc3e9b7e57fa336eb066d0aa13849c39ad946c76c8efd1246f"}
Apr 22 17:53:58.801461 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.801443 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log"
Apr 22 17:53:58.801798 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.801777 2583 generic.go:358] "Generic (PLEG): container finished" podID="44a6788e-ee8d-4d8f-9255-fc53fdbd083f" containerID="7109cece29c7293f5f391f792bcd653c131ec0fefabbb6b93f7a166403da051a" exitCode=1
Apr 22 17:53:58.801877 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.801850 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerDied","Data":"7109cece29c7293f5f391f792bcd653c131ec0fefabbb6b93f7a166403da051a"}
Apr 22 17:53:58.801945 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.801892 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"7e788f872e372b95a7ab9b6dddde11f50fceaf16a1faa7bcfb83bfe141ab1524"}
Apr 22 17:53:58.803108 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.803080 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-56xcg" event={"ID":"0921ce28-1383-4534-bfc2-4751014a996a","Type":"ContainerStarted","Data":"5cfd52ff4b494f7bd55adf55a06cb13c392a2a1e0ff0b9fbabc2457b7ef5868c"}
Apr 22 17:53:58.804342 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.804308 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7p29d" event={"ID":"b56ca0f7-5e42-4d61-9c1d-fff86d2affdd","Type":"ContainerStarted","Data":"870de9b2241068a013771b9309b89ab7970609a99adc206b4647146061b67546"}
Apr 22 17:53:58.805915 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.805894 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" event={"ID":"047f2672-ea3f-4aac-a645-3bbf7fa9342c","Type":"ContainerStarted","Data":"17e211db30e0dd1c4e744993656467df5b5ea823838c6830426c983fa11e6b9b"}
Apr 22 17:53:58.807231 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.807208 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xj964" event={"ID":"661612b0-cef9-4d3b-ad68-e9507dc62d38","Type":"ContainerStarted","Data":"a418a14554b2c52b782839c2e1f54addf243810738d04090f8f0bd62d89690f6"}
Apr 22 17:53:58.808569 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.808549 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qrk67" event={"ID":"9f4638bd-d27c-475b-9ecd-d0faa1ba55d2","Type":"ContainerStarted","Data":"fbbd0eb497b6ab0177273f14ff185f84e7cbe4454ce3912de12d237283e39565"}
Apr 22 17:53:58.809979 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.809953 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kpz56" event={"ID":"f625cc6f-1340-411a-b28d-19e397e691de","Type":"ContainerStarted","Data":"39d5bb048b1fd31246f101aa68f0aa3674a19d691233c2ad3f4d792e6cc73233"}
Apr 22 17:53:58.824687 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.824523 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-69.ec2.internal" podStartSLOduration=19.824507361 podStartE2EDuration="19.824507361s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:43.795756226 +0000 UTC m=+5.673025398" watchObservedRunningTime="2026-04-22 17:53:58.824507361 +0000 UTC m=+20.701776554"
Apr 22 17:53:58.840126 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.840089 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qrk67" podStartSLOduration=3.763032074 podStartE2EDuration="20.840076207s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.213242949 +0000 UTC m=+3.090512101" lastFinishedPulling="2026-04-22 17:53:58.290287082 +0000 UTC m=+20.167556234" observedRunningTime="2026-04-22 17:53:58.839905812 +0000 UTC m=+20.717174979" watchObservedRunningTime="2026-04-22 17:53:58.840076207 +0000 UTC m=+20.717345375"
Apr 22 17:53:58.855529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.855478 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xj964" podStartSLOduration=3.7843788160000003 podStartE2EDuration="20.855463339s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.219208396 +0000 UTC m=+3.096477552" lastFinishedPulling="2026-04-22 17:53:58.290292926 +0000 UTC m=+20.167562075" observedRunningTime="2026-04-22 17:53:58.854965112 +0000 UTC m=+20.732234291" watchObservedRunningTime="2026-04-22 17:53:58.855463339 +0000 UTC m=+20.732732510"
Apr 22 17:53:58.871611 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.871567 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-56xcg" podStartSLOduration=3.823251547 podStartE2EDuration="20.871554676s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.219512749 +0000 UTC m=+3.096781912" lastFinishedPulling="2026-04-22 17:53:58.267815886 +0000 UTC m=+20.145085041" observedRunningTime="2026-04-22 17:53:58.871109969 +0000 UTC m=+20.748379139" watchObservedRunningTime="2026-04-22 17:53:58.871554676 +0000 UTC m=+20.748823844"
Apr 22 17:53:58.887662 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:58.887551 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kpz56" podStartSLOduration=3.772259988 podStartE2EDuration="20.887534242s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.211000102 +0000 UTC m=+3.088269251" lastFinishedPulling="2026-04-22 17:53:58.326274357 +0000 UTC m=+20.203543505" observedRunningTime="2026-04-22 17:53:58.886962553 +0000 UTC m=+20.764231744" watchObservedRunningTime="2026-04-22 17:53:58.887534242 +0000 UTC m=+20.764803411"
Apr 22 17:53:59.270911 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.270878 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:59.271614 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.271590 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:59.303563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.303484 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7p29d" podStartSLOduration=3.23196061 podStartE2EDuration="20.303470727s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.219197377 +0000 UTC m=+3.096466537" lastFinishedPulling="2026-04-22 17:53:58.2907075 +0000 UTC m=+20.167976654" observedRunningTime="2026-04-22 17:53:58.900752993 +0000 UTC m=+20.778022174" watchObservedRunningTime="2026-04-22 17:53:59.303470727 +0000 UTC m=+21.180739895"
Apr 22 17:53:59.813476 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.813450 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log"
Apr 22 17:53:59.813913 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.813820 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"32e53600913fc17fded59813b778ab88c58d659f12871e17df07427077d95fd3"}
Apr 22 17:53:59.813913 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.813849 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"5cac6136b0d42ead7cd0841193dac7955e419f160b32f63c3593cb38ece783ab"}
Apr 22 17:53:59.813913 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.813864 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"2c90488cc70c220a7c5c899a77c233500d11af3fef46ebae6cb0443e7ad0794a"}
Apr 22 17:53:59.813913 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.813873 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"9ba73f7b6c54a5cf0e981334e6776be7e66020140847a1ae7fd98de42f779c9d"}
Apr 22 17:53:59.815107 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.815081 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-96thx" event={"ID":"21690994-61ae-4434-baa1-49a8adf56490","Type":"ContainerStarted","Data":"7b7242203d354fea3de94136aedf3fb9fbc047b4bcdf82ba9b2e1402ae2b9edb"}
Apr 22 17:53:59.816407 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.816382 2583 generic.go:358] "Generic (PLEG): container finished" podID="dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4" containerID="3bfbab4d0ad693bc3e9b7e57fa336eb066d0aa13849c39ad946c76c8efd1246f" exitCode=0
Apr 22 17:53:59.816537 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.816509 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerDied","Data":"3bfbab4d0ad693bc3e9b7e57fa336eb066d0aa13849c39ad946c76c8efd1246f"}
Apr 22 17:53:59.816737 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.816718 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:59.817395 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.817302 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xj964"
Apr 22 17:53:59.853029 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:53:59.852996 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-96thx" podStartSLOduration=4.7805785 podStartE2EDuration="21.852985371s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.21830322 +0000 UTC m=+3.095572369" lastFinishedPulling="2026-04-22 17:53:58.29071008 +0000 UTC m=+20.167979240" observedRunningTime="2026-04-22 17:53:59.829780285 +0000 UTC m=+21.707049450" watchObservedRunningTime="2026-04-22 17:53:59.852985371 +0000 UTC m=+21.730254598"
Apr 22 17:54:00.144925 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.144788 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 17:54:00.607296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.607183 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:54:00.144920897Z","UUID":"233800f3-170e-41f7-9479-1ad03090ea0a","Handler":null,"Name":"","Endpoint":""}
Apr 22 17:54:00.611022 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.610993 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 17:54:00.611164 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.611031 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 17:54:00.661603 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.661569 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:54:00.661603 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.661590 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:54:00.661829 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:00.661685 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118"
Apr 22 17:54:00.661890 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:00.661823 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7" Apr 22 17:54:00.821668 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:00.821199 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" event={"ID":"047f2672-ea3f-4aac-a645-3bbf7fa9342c","Type":"ContainerStarted","Data":"6ebef19d83fd0da19c3280363d53583460a8cf5e771c6c9c3f3a726060ab4d64"} Apr 22 17:54:01.825526 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:01.825445 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 17:54:01.826186 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:01.825851 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"5b36f0170f7cbca5f77a20b67b649214120a8a4636913953ec5ceb489cf5cb20"} Apr 22 17:54:01.827721 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:01.827688 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" event={"ID":"047f2672-ea3f-4aac-a645-3bbf7fa9342c","Type":"ContainerStarted","Data":"5d10dab2682fbd5e51a62ed7fc65fdc291f90f7123fb5d64a962d1bfdae1f4ad"} Apr 22 17:54:01.853676 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:01.853618 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74dnt" podStartSLOduration=3.764256381 podStartE2EDuration="23.853604684s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.220532534 +0000 UTC m=+3.097801690" lastFinishedPulling="2026-04-22 17:54:01.309880847 +0000 UTC m=+23.187149993" observedRunningTime="2026-04-22 17:54:01.852402326 +0000 UTC 
m=+23.729671494" watchObservedRunningTime="2026-04-22 17:54:01.853604684 +0000 UTC m=+23.730873846" Apr 22 17:54:02.661106 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:02.661065 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:54:02.661279 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:02.661204 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7" Apr 22 17:54:02.661279 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:02.661236 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:02.661363 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:02.661300 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118" Apr 22 17:54:03.833540 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.833346 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerStarted","Data":"c3789ea6956952bb0142b971f06f6244c20f03032c93510354ac9259e5089d51"} Apr 22 17:54:03.837069 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.837042 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 17:54:03.837359 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.837336 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"5a16332df80f3260fe6f2617d805a21c3547e04d90e7e3fd73e41037860fdf65"} Apr 22 17:54:03.837981 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.837851 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:54:03.837981 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.837881 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:54:03.837981 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.837915 2583 scope.go:117] "RemoveContainer" containerID="7109cece29c7293f5f391f792bcd653c131ec0fefabbb6b93f7a166403da051a" Apr 22 17:54:03.838184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.838061 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:54:03.855574 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.855541 2583 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:54:03.856312 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:03.856291 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:54:04.661699 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.661673 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:54:04.661818 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.661732 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:04.661818 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:04.661773 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7" Apr 22 17:54:04.661818 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:04.661803 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118" Apr 22 17:54:04.841459 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.841430 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 17:54:04.841908 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.841779 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" event={"ID":"44a6788e-ee8d-4d8f-9255-fc53fdbd083f","Type":"ContainerStarted","Data":"d15bc69fad50fe99f277b766a9db4aaa456e4f4be1c45e8a65e80d4df46d583d"} Apr 22 17:54:04.843170 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.843148 2583 generic.go:358] "Generic (PLEG): container finished" podID="dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4" containerID="c3789ea6956952bb0142b971f06f6244c20f03032c93510354ac9259e5089d51" exitCode=0 Apr 22 17:54:04.843315 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.843181 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerDied","Data":"c3789ea6956952bb0142b971f06f6244c20f03032c93510354ac9259e5089d51"} Apr 22 17:54:04.898010 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:04.897950 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" podStartSLOduration=8.746496052 podStartE2EDuration="25.897929246s" podCreationTimestamp="2026-04-22 17:53:39 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.214425793 +0000 UTC m=+3.091694943" lastFinishedPulling="2026-04-22 17:53:58.365858988 +0000 UTC m=+20.243128137" observedRunningTime="2026-04-22 17:54:04.873874983 +0000 UTC m=+26.751144152" watchObservedRunningTime="2026-04-22 17:54:04.897929246 +0000 UTC m=+26.775198416" Apr 22 17:54:05.635431 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:54:05.635183 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p96l2"] Apr 22 17:54:05.635557 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:05.635520 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:05.635647 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:05.635610 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118" Apr 22 17:54:05.641153 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:05.641130 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xk988"] Apr 22 17:54:05.641260 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:05.641210 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:54:05.641325 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:05.641314 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7" Apr 22 17:54:05.849972 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:05.849900 2583 generic.go:358] "Generic (PLEG): container finished" podID="dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4" containerID="e71050748e35889a07aad0e3f3172dfc8938bfe9d00322186111a2712811b3b7" exitCode=0 Apr 22 17:54:05.850322 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:05.849980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerDied","Data":"e71050748e35889a07aad0e3f3172dfc8938bfe9d00322186111a2712811b3b7"} Apr 22 17:54:06.854526 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:06.854431 2583 generic.go:358] "Generic (PLEG): container finished" podID="dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4" containerID="256747e913d0813f3b1fdd65024b90bb949380b8b76c0dac7332e8862b4fb882" exitCode=0 Apr 22 17:54:06.854526 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:06.854495 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerDied","Data":"256747e913d0813f3b1fdd65024b90bb949380b8b76c0dac7332e8862b4fb882"} Apr 22 17:54:07.661084 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:07.661058 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:07.661217 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:07.661158 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118" Apr 22 17:54:07.661263 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:07.661065 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:54:07.661324 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:07.661307 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7" Apr 22 17:54:09.661334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:09.661293 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:09.661953 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:09.661297 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:54:09.661953 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:09.661450 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p96l2" podUID="e510245b-cd68-4ee7-8f7d-e72ddbd61118" Apr 22 17:54:09.661953 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:09.661535 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xk988" podUID="c266f31e-39da-4b15-a687-f1304c2e67b7" Apr 22 17:54:11.455401 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.455320 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-69.ec2.internal" event="NodeReady" Apr 22 17:54:11.456023 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.455473 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:54:11.504376 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.504348 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pbfbf"] Apr 22 17:54:11.508698 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.508651 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dqmzc"] Apr 22 17:54:11.508832 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.508808 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.511464 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.511445 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:54:11.511583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.511468 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dqmzc" Apr 22 17:54:11.511660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.511446 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:54:11.511952 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.511932 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rhqnx\"" Apr 22 17:54:11.515379 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.514262 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dchxj\"" Apr 22 17:54:11.515379 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.514235 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:54:11.515379 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.514474 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:54:11.515379 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.514688 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:54:11.518214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.517694 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pbfbf"] Apr 22 17:54:11.520795 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.520738 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dqmzc"] Apr 22 17:54:11.590456 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.590414 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/db6e7b89-b68b-4920-9aaa-d35998aee879-config-volume\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.590645 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.590531 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b78a02-ed41-4b77-adce-5c8606af896b-cert\") pod \"ingress-canary-dqmzc\" (UID: \"e2b78a02-ed41-4b77-adce-5c8606af896b\") " pod="openshift-ingress-canary/ingress-canary-dqmzc" Apr 22 17:54:11.590645 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.590576 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/db6e7b89-b68b-4920-9aaa-d35998aee879-tmp-dir\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.590770 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.590702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6e7b89-b68b-4920-9aaa-d35998aee879-metrics-tls\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.590770 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.590757 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsx8\" (UniqueName: \"kubernetes.io/projected/e2b78a02-ed41-4b77-adce-5c8606af896b-kube-api-access-nqsx8\") pod \"ingress-canary-dqmzc\" (UID: \"e2b78a02-ed41-4b77-adce-5c8606af896b\") " pod="openshift-ingress-canary/ingress-canary-dqmzc" Apr 22 17:54:11.590878 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.590793 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jkv4t\" (UniqueName: \"kubernetes.io/projected/db6e7b89-b68b-4920-9aaa-d35998aee879-kube-api-access-jkv4t\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.606540 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.606510 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5skr5"] Apr 22 17:54:11.609774 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.609756 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5skr5" Apr 22 17:54:11.612379 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.612357 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:54:11.612977 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.612919 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:54:11.612977 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.612972 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:54:11.612977 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.612985 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dqcf6\"" Apr 22 17:54:11.613191 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.612984 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:54:11.618964 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.618943 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5skr5"] Apr 22 17:54:11.661042 ip-10-0-131-69 kubenswrapper[2583]: I0422 
17:54:11.661015 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:11.661042 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.661033 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988" Apr 22 17:54:11.663674 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.663653 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:54:11.663792 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.663748 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:54:11.664160 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.664139 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9cn8\"" Apr 22 17:54:11.664253 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.664219 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:54:11.664253 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.664146 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7sslk\"" Apr 22 17:54:11.691859 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.691833 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db6e7b89-b68b-4920-9aaa-d35998aee879-config-volume\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.691974 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.691876 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-data-volume\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5" Apr 22 17:54:11.691974 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.691905 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5" Apr 22 17:54:11.692148 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.691996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b78a02-ed41-4b77-adce-5c8606af896b-cert\") pod \"ingress-canary-dqmzc\" (UID: \"e2b78a02-ed41-4b77-adce-5c8606af896b\") " pod="openshift-ingress-canary/ingress-canary-dqmzc" Apr 22 17:54:11.692148 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692040 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/db6e7b89-b68b-4920-9aaa-d35998aee879-tmp-dir\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.692148 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692075 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-crio-socket\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5" Apr 22 17:54:11.692148 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692118 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5" Apr 22 17:54:11.692334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692192 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6e7b89-b68b-4920-9aaa-d35998aee879-metrics-tls\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.692334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692227 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqq2\" (UniqueName: \"kubernetes.io/projected/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-kube-api-access-2mqq2\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5" Apr 22 17:54:11.692334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692273 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsx8\" (UniqueName: \"kubernetes.io/projected/e2b78a02-ed41-4b77-adce-5c8606af896b-kube-api-access-nqsx8\") pod \"ingress-canary-dqmzc\" (UID: \"e2b78a02-ed41-4b77-adce-5c8606af896b\") " pod="openshift-ingress-canary/ingress-canary-dqmzc" Apr 22 17:54:11.692334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692303 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkv4t\" (UniqueName: \"kubernetes.io/projected/db6e7b89-b68b-4920-9aaa-d35998aee879-kube-api-access-jkv4t\") pod 
\"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.692497 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692436 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/db6e7b89-b68b-4920-9aaa-d35998aee879-tmp-dir\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.692497 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.692460 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/db6e7b89-b68b-4920-9aaa-d35998aee879-config-volume\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.696520 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.696477 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/db6e7b89-b68b-4920-9aaa-d35998aee879-metrics-tls\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:11.696732 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.696712 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b78a02-ed41-4b77-adce-5c8606af896b-cert\") pod \"ingress-canary-dqmzc\" (UID: \"e2b78a02-ed41-4b77-adce-5c8606af896b\") " pod="openshift-ingress-canary/ingress-canary-dqmzc" Apr 22 17:54:11.700723 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.700697 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkv4t\" (UniqueName: \"kubernetes.io/projected/db6e7b89-b68b-4920-9aaa-d35998aee879-kube-api-access-jkv4t\") pod \"dns-default-pbfbf\" (UID: \"db6e7b89-b68b-4920-9aaa-d35998aee879\") " pod="openshift-dns/dns-default-pbfbf" 
Apr 22 17:54:11.700984 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.700964 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsx8\" (UniqueName: \"kubernetes.io/projected/e2b78a02-ed41-4b77-adce-5c8606af896b-kube-api-access-nqsx8\") pod \"ingress-canary-dqmzc\" (UID: \"e2b78a02-ed41-4b77-adce-5c8606af896b\") " pod="openshift-ingress-canary/ingress-canary-dqmzc"
Apr 22 17:54:11.793555 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.793472 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-data-volume\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.793555 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.793512 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.793799 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.793708 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-crio-socket\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.793799 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.793768 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.793909 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.793812 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqq2\" (UniqueName: \"kubernetes.io/projected/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-kube-api-access-2mqq2\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.793909 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.793868 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-data-volume\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.794296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.794065 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-crio-socket\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.794418 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.794094 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.796509 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.796489 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.802963 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.802940 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqq2\" (UniqueName: \"kubernetes.io/projected/6609b3c4-cfd3-49ff-a1e6-faff9ad000b9-kube-api-access-2mqq2\") pod \"insights-runtime-extractor-5skr5\" (UID: \"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9\") " pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:11.823785 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.823752 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pbfbf"
Apr 22 17:54:11.829688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.829665 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dqmzc"
Apr 22 17:54:11.919880 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:11.919851 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5skr5"
Apr 22 17:54:12.299019 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.298989 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:54:12.299184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.299054 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:54:12.301555 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.301532 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c266f31e-39da-4b15-a687-f1304c2e67b7-metrics-certs\") pod \"network-metrics-daemon-xk988\" (UID: \"c266f31e-39da-4b15-a687-f1304c2e67b7\") " pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:54:12.301889 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.301873 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5c4j\" (UniqueName: \"kubernetes.io/projected/e510245b-cd68-4ee7-8f7d-e72ddbd61118-kube-api-access-d5c4j\") pod \"network-check-target-p96l2\" (UID: \"e510245b-cd68-4ee7-8f7d-e72ddbd61118\") " pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:54:12.473935 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.473906 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dqmzc"]
Apr 22 17:54:12.478480 ip-10-0-131-69
kubenswrapper[2583]: I0422 17:54:12.478456 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pbfbf"]
Apr 22 17:54:12.479374 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.479309 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5skr5"]
Apr 22 17:54:12.489942 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:12.489914 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb6e7b89_b68b_4920_9aaa_d35998aee879.slice/crio-af51a7e5d674659090ed9eff154c56cce9db83f042b81080c59f1fbe9b74ce23 WatchSource:0}: Error finding container af51a7e5d674659090ed9eff154c56cce9db83f042b81080c59f1fbe9b74ce23: Status 404 returned error can't find the container with id af51a7e5d674659090ed9eff154c56cce9db83f042b81080c59f1fbe9b74ce23
Apr 22 17:54:12.555610 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.555545 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"]
Apr 22 17:54:12.567914 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.567892 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.571309 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.571286 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 17:54:12.571416 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.571310 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p96l2"
Apr 22 17:54:12.571938 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.571829 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:54:12.572271 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.572151 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 17:54:12.573642 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.573284 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:54:12.577014 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.576991 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xk988"
Apr 22 17:54:12.581450 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.578088 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-cgw7m\""
Apr 22 17:54:12.581450 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.578123 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 17:54:12.583600 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.581895 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"]
Apr 22 17:54:12.597055 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.596927 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-98j6g"]
Apr 22 17:54:12.616328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.615572 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.619217 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.618944 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 17:54:12.619217 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.618989 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 17:54:12.619217 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.619005 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-6r5wl\""
Apr 22 17:54:12.620066 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.619869 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 17:54:12.622613 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.622568 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-98j6g"]
Apr 22 17:54:12.647097 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.646857 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-464cb"]
Apr 22 17:54:12.673748 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.672947 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.683852 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.678321 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 17:54:12.683852 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.678705 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 17:54:12.683852 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.678993 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zb7n5\""
Apr 22 17:54:12.684169 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.683905 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 17:54:12.701403 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701377 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.701524 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701411 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.701524 ip-10-0-131-69
kubenswrapper[2583]: I0422 17:54:12.701433 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.701524 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701455 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.701524 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701469 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbwqw\" (UniqueName: \"kubernetes.io/projected/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-kube-api-access-tbwqw\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.701524 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701487 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh8b2\" (UniqueName: \"kubernetes.io/projected/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-api-access-qh8b2\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.701524 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701515 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.701898 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701537 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.701898 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.701898 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.701570 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.760278 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.760251 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p96l2"]
Apr 22 17:54:12.767073 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.767050 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xk988"]
Apr 22 17:54:12.770442 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:12.770412 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc266f31e_39da_4b15_a687_f1304c2e67b7.slice/crio-c9338ec3b296460202f363f8a444161b0c4295360ecba0117a827c6e3e81d31a WatchSource:0}: Error finding container c9338ec3b296460202f363f8a444161b0c4295360ecba0117a827c6e3e81d31a: Status 404 returned error can't find the container with id c9338ec3b296460202f363f8a444161b0c4295360ecba0117a827c6e3e81d31a
Apr 22 17:54:12.770993 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:12.770965 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode510245b_cd68_4ee7_8f7d_e72ddbd61118.slice/crio-44adf628bd6dfdbf6cbbe7cf2c2e3c03b480e492c0423c5f2cf37d8c3d092e06 WatchSource:0}: Error finding container 44adf628bd6dfdbf6cbbe7cf2c2e3c03b480e492c0423c5f2cf37d8c3d092e06: Status 404 returned error can't find the container with id 44adf628bd6dfdbf6cbbe7cf2c2e3c03b480e492c0423c5f2cf37d8c3d092e06
Apr 22 17:54:12.802257 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802234 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.802338 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802269 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-metrics-client-ca\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802338 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802289 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.802338 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802321 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.802483 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802379 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-root\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802483 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:12.802413 2583 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 17:54:12.802483 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802423 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-tls\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802483 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802452 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.802483 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:12.802465 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-tls podName:c6252ec2-1bae-4d24-83c8-f43a6bdb5885 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:13.302445715 +0000 UTC m=+35.179714877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-98j6g" (UID: "c6252ec2-1bae-4d24-83c8-f43a6bdb5885") : secret "kube-state-metrics-tls" not found
Apr 22 17:54:12.802724 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802505 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.802724 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802559 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.802724 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802593 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802724 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802646 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh8b2\" (UniqueName: \"kubernetes.io/projected/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-api-access-qh8b2\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.802724 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802672 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-accelerators-collector-config\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802724 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802706 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbwqw\" (UniqueName: \"kubernetes.io/projected/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-kube-api-access-tbwqw\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.802995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802729 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxz4c\" (UniqueName: \"kubernetes.io/projected/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-kube-api-access-cxz4c\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802757 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-textfile\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802786 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-sys\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802820 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-wtmp\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.802995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802846 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.802995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.802876 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.803273 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.803187 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName:
\"kubernetes.io/empty-dir/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.803327 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.803300 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.803375 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.803324 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.803607 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.803556 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.806277 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.806228 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.806422 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.806400 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.806487 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.806423 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.812336 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.812314 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbwqw\" (UniqueName: \"kubernetes.io/projected/fb50ea50-20b2-41eb-8fc1-e4b99b72dee3-kube-api-access-tbwqw\") pod \"openshift-state-metrics-9d44df66c-dr4qw\" (UID: \"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.812768 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.812746 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh8b2\" (UniqueName: \"kubernetes.io/projected/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-api-access-qh8b2\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g"
Apr 22 17:54:12.868442 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.868411 2583 generic.go:358] "Generic (PLEG): container finished" podID="dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4" containerID="5a8acd03e67915e3a6c0ac109be38af208665887420420e47c483aa8e18e5d15" exitCode=0
Apr 22 17:54:12.868564 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.868460 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerDied","Data":"5a8acd03e67915e3a6c0ac109be38af208665887420420e47c483aa8e18e5d15"}
Apr 22 17:54:12.869637 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.869606 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xk988" event={"ID":"c266f31e-39da-4b15-a687-f1304c2e67b7","Type":"ContainerStarted","Data":"c9338ec3b296460202f363f8a444161b0c4295360ecba0117a827c6e3e81d31a"}
Apr 22 17:54:12.870585 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.870561 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dqmzc" event={"ID":"e2b78a02-ed41-4b77-adce-5c8606af896b","Type":"ContainerStarted","Data":"756a5444ecc836ab2a2bad4d37ac8131b821fadb1ada120f8702ffc9fecfdc16"}
Apr 22 17:54:12.871836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.871814 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5skr5" event={"ID":"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9","Type":"ContainerStarted","Data":"95fa9a9d8533f69dc2f44a19a66554107cf0d62884c4b65a167eb22bee901481"}
Apr 22 17:54:12.871836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.871842 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5skr5" event={"ID":"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9","Type":"ContainerStarted","Data":"2c27a9ece1c530e6c29098dbfad21190d47b8ffe0650dd82cfe1d79d550cb4d6"}
Apr 22 17:54:12.872810 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.872787 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pbfbf" event={"ID":"db6e7b89-b68b-4920-9aaa-d35998aee879","Type":"ContainerStarted","Data":"af51a7e5d674659090ed9eff154c56cce9db83f042b81080c59f1fbe9b74ce23"}
Apr 22 17:54:12.873814 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.873794 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p96l2" event={"ID":"e510245b-cd68-4ee7-8f7d-e72ddbd61118","Type":"ContainerStarted","Data":"44adf628bd6dfdbf6cbbe7cf2c2e3c03b480e492c0423c5f2cf37d8c3d092e06"}
Apr 22 17:54:12.882909 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.882894 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"
Apr 22 17:54:12.903995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.903965 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.904117 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904021 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-accelerators-collector-config\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb"
Apr 22 17:54:12.904117 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxz4c\" (UniqueName:
\"kubernetes.io/projected/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-kube-api-access-cxz4c\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904117 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904094 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-textfile\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904260 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904121 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-sys\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904260 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904153 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-wtmp\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904260 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904188 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-metrics-client-ca\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904260 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904243 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" 
(UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-root\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904448 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904283 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-tls\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904587 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904500 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-wtmp\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904676 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904606 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-sys\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904886 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904849 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-root\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.904986 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.904904 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-accelerators-collector-config\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.905178 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.905133 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-textfile\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.905396 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.905353 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-metrics-client-ca\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.906843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.906800 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.907668 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.907650 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-node-exporter-tls\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:12.913790 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:12.913564 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cxz4c\" (UniqueName: \"kubernetes.io/projected/03bd3116-7dba-4d28-b3ca-6b85602e0bf2-kube-api-access-cxz4c\") pod \"node-exporter-464cb\" (UID: \"03bd3116-7dba-4d28-b3ca-6b85602e0bf2\") " pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:13.003921 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.003834 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-464cb" Apr 22 17:54:13.024214 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:13.024184 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03bd3116_7dba_4d28_b3ca_6b85602e0bf2.slice/crio-d69143b6188aaa3e2f8e13a19a48db70215351e34bc21fe18018d0c6a80cb769 WatchSource:0}: Error finding container d69143b6188aaa3e2f8e13a19a48db70215351e34bc21fe18018d0c6a80cb769: Status 404 returned error can't find the container with id d69143b6188aaa3e2f8e13a19a48db70215351e34bc21fe18018d0c6a80cb769 Apr 22 17:54:13.028570 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.028446 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw"] Apr 22 17:54:13.030548 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:13.030524 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb50ea50_20b2_41eb_8fc1_e4b99b72dee3.slice/crio-b96078f2a41fc0b9774f41377b7f89c7031505ee02c36fff1e29cefa78e0d68e WatchSource:0}: Error finding container b96078f2a41fc0b9774f41377b7f89c7031505ee02c36fff1e29cefa78e0d68e: Status 404 returned error can't find the container with id b96078f2a41fc0b9774f41377b7f89c7031505ee02c36fff1e29cefa78e0d68e Apr 22 17:54:13.308362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.308319 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" Apr 22 17:54:13.315668 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.315524 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6252ec2-1bae-4d24-83c8-f43a6bdb5885-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-98j6g\" (UID: \"c6252ec2-1bae-4d24-83c8-f43a6bdb5885\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" Apr 22 17:54:13.531659 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.531600 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" Apr 22 17:54:13.578356 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.578277 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:54:13.599658 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.599212 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:54:13.599658 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.599416 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.602554 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.602528 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 17:54:13.602795 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.602773 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 17:54:13.602994 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.602964 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 17:54:13.603212 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.603187 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 17:54:13.603399 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.603383 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7thtv\"" Apr 22 17:54:13.603780 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.603763 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 17:54:13.604031 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.604014 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 17:54:13.604253 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.604236 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 17:54:13.604447 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.604429 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 17:54:13.605576 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.605560 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710769 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710819 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710862 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710892 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710927 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-config-out\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710959 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.710983 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.711006 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wx8m\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-kube-api-access-6wx8m\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.711034 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.711081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-web-config\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.711108 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.711144 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.712660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.711181 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.812838 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.812939 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-web-config\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.812978 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813014 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813079 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813144 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813172 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-config-volume\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813204 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-config-out\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813240 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813266 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.814133 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.813289 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wx8m\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-kube-api-access-6wx8m\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.819354 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.816311 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.819354 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:54:13.816451 2583 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle podName:b42658da-172b-46ac-b125-b71c78667e53 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:14.316431026 +0000 UTC m=+36.193700191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "b42658da-172b-46ac-b125-b71c78667e53") : configmap references non-existent config key: ca-bundle.crt Apr 22 17:54:13.819354 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.817967 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.827150 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.823791 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.828298 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.828250 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-config-volume\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:54:13.830426 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.829086 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-config-out\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.830426 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.829263 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-web-config\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.830426 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.830066 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.830426 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.830332 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.831741 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.830808 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.831741 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.831243 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.831741 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.831681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.838816 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.836262 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wx8m\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-kube-api-access-6wx8m\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:13.885168 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.885129 2583 generic.go:358] "Generic (PLEG): container finished" podID="dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4" containerID="88ed0135175146415f202a2e24851c2e47b339efa1d5abc9e7df39958f56d75e" exitCode=0
Apr 22 17:54:13.885327 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.885262 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerDied","Data":"88ed0135175146415f202a2e24851c2e47b339efa1d5abc9e7df39958f56d75e"}
Apr 22 17:54:13.891883 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.891789 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw" event={"ID":"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3","Type":"ContainerStarted","Data":"fd2fdf58ab08c9403e38f7d4b372a949eb2848e46849512b46c92e729c3990af"}
Apr 22 17:54:13.891883 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.891828 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw" event={"ID":"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3","Type":"ContainerStarted","Data":"7548e99730688870784f009e01847269c4b15afbf8a392439328c69c7008eaa8"}
Apr 22 17:54:13.891883 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.891844 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw" event={"ID":"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3","Type":"ContainerStarted","Data":"b96078f2a41fc0b9774f41377b7f89c7031505ee02c36fff1e29cefa78e0d68e"}
Apr 22 17:54:13.896646 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.896596 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-464cb" event={"ID":"03bd3116-7dba-4d28-b3ca-6b85602e0bf2","Type":"ContainerStarted","Data":"d69143b6188aaa3e2f8e13a19a48db70215351e34bc21fe18018d0c6a80cb769"}
Apr 22 17:54:13.903899 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:13.903753 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-98j6g"]
Apr 22 17:54:13.911776 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:13.911750 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6252ec2_1bae_4d24_83c8_f43a6bdb5885.slice/crio-b005b7bc1f37778f0db9d6687c6735300693db9b1d2d8b0f9c9d89d37f69aed0 WatchSource:0}: Error finding container b005b7bc1f37778f0db9d6687c6735300693db9b1d2d8b0f9c9d89d37f69aed0: Status 404 returned error can't find the container with id b005b7bc1f37778f0db9d6687c6735300693db9b1d2d8b0f9c9d89d37f69aed0
Apr 22 17:54:14.317806 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.317702 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:14.318879 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.318844 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:14.532420 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.532369 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:54:14.600842 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.600718 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-76d946b44c-6cv98"]
Apr 22 17:54:14.612786 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.612756 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.615789 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.615755 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 17:54:14.616058 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.616041 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 17:54:14.616129 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.616087 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-4d2g8\""
Apr 22 17:54:14.616291 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.616270 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 17:54:14.617710 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.617688 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 17:54:14.622686 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.618653 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 17:54:14.622686 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.619559 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dbilide1umfcd\""
Apr 22 17:54:14.623378 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.623316 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-76d946b44c-6cv98"]
Apr 22 17:54:14.720300 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720264 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720451 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720314 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-metrics-client-ca\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720451 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720382 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-kube-api-access-2snss\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720451 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720427 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-grpc-tls\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720646 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720490 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720646 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720531 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720646 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720563 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-tls\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.720646 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.720607 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821293 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821204 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-metrics-client-ca\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821293 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821255 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-kube-api-access-2snss\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-grpc-tls\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821460 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821509 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821539 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-tls\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821784 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821577 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.821784 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.821670 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.822064 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.822017 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-metrics-client-ca\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.825516 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.825349 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.825516 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.825360 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.825516 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.825357 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-tls\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.825516 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.825436 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.825516 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.825450 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-grpc-tls\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.826702 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.826659 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.830967 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.830926 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2snss\" (UniqueName: \"kubernetes.io/projected/50b8ed86-51c1-40bb-87e4-aeb8c84e9639-kube-api-access-2snss\") pod \"thanos-querier-76d946b44c-6cv98\" (UID: \"50b8ed86-51c1-40bb-87e4-aeb8c84e9639\") " pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.900970 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.900941 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5skr5" event={"ID":"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9","Type":"ContainerStarted","Data":"3a88f4ab89fea57172bf7aa6c8729372b30fdeb0fa8b757d1183f4de6eb25b07"}
Apr 22 17:54:14.902310 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.902280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" event={"ID":"c6252ec2-1bae-4d24-83c8-f43a6bdb5885","Type":"ContainerStarted","Data":"b005b7bc1f37778f0db9d6687c6735300693db9b1d2d8b0f9c9d89d37f69aed0"}
Apr 22 17:54:14.905580 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.905556 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" event={"ID":"dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4","Type":"ContainerStarted","Data":"8bf0493b49fc8bbef1bdc05e507d6ebc08875f1e0c564ea19f5641fd7b9197e3"}
Apr 22 17:54:14.928137 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.928112 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98"
Apr 22 17:54:14.928239 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:14.928099 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vh7jd" podStartSLOduration=5.762429471 podStartE2EDuration="36.928086645s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:53:41.208883402 +0000 UTC m=+3.086152549" lastFinishedPulling="2026-04-22 17:54:12.374540569 +0000 UTC m=+34.251809723" observedRunningTime="2026-04-22 17:54:14.927013459 +0000 UTC m=+36.804282630" watchObservedRunningTime="2026-04-22 17:54:14.928086645 +0000 UTC m=+36.805355813"
Apr 22 17:54:17.128832 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.128648 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7dbd6fccb-srbbj"]
Apr 22 17:54:17.140744 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.140716 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.143116 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.143085 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7dbd6fccb-srbbj"]
Apr 22 17:54:17.143659 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.143613 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 17:54:17.143659 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.143646 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 17:54:17.143659 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.143654 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 17:54:17.144976 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.144812 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 17:54:17.144976 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.144820 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3v6g0odm5fub\""
Apr 22 17:54:17.144976 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.144866 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-2fbt8\""
Apr 22 17:54:17.244812 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.244776 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-secret-metrics-server-client-certs\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.244993 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.244818 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-secret-metrics-server-tls\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.244993 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.244927 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/db7aa446-cb16-4cf3-8ab1-3215184cf20c-metrics-server-audit-profiles\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.245100 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.245002 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/db7aa446-cb16-4cf3-8ab1-3215184cf20c-audit-log\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.245100 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.245037 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-client-ca-bundle\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.245100 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.245070 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtvz\" (UniqueName: \"kubernetes.io/projected/db7aa446-cb16-4cf3-8ab1-3215184cf20c-kube-api-access-rgtvz\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.245239 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.245156 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7aa446-cb16-4cf3-8ab1-3215184cf20c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.345960 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.345929 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/db7aa446-cb16-4cf3-8ab1-3215184cf20c-audit-log\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.345960 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.345969 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-client-ca-bundle\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.346302 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.345992 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtvz\" (UniqueName: \"kubernetes.io/projected/db7aa446-cb16-4cf3-8ab1-3215184cf20c-kube-api-access-rgtvz\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.346302 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.346024 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7aa446-cb16-4cf3-8ab1-3215184cf20c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.346302 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.346051 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-secret-metrics-server-client-certs\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.346302 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.346078 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-secret-metrics-server-tls\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.346302 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.346130 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/db7aa446-cb16-4cf3-8ab1-3215184cf20c-metrics-server-audit-profiles\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.349693 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.349667 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/db7aa446-cb16-4cf3-8ab1-3215184cf20c-audit-log\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.350044 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.350020 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db7aa446-cb16-4cf3-8ab1-3215184cf20c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.350496 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.350471 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/db7aa446-cb16-4cf3-8ab1-3215184cf20c-metrics-server-audit-profiles\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.352065 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.352043 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-client-ca-bundle\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.352266 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.352242 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-secret-metrics-server-client-certs\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.352350 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.352267 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/db7aa446-cb16-4cf3-8ab1-3215184cf20c-secret-metrics-server-tls\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.353872 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.353854 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtvz\" (UniqueName: \"kubernetes.io/projected/db7aa446-cb16-4cf3-8ab1-3215184cf20c-kube-api-access-rgtvz\") pod \"metrics-server-7dbd6fccb-srbbj\" (UID: \"db7aa446-cb16-4cf3-8ab1-3215184cf20c\") " pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.355964 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.355942 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"]
Apr 22 17:54:17.381530 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.381464 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"]
Apr 22 17:54:17.381655 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.381573 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"
Apr 22 17:54:17.384109 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.384088 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 17:54:17.384231 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.384120 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-2dvd2\""
Apr 22 17:54:17.451429 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.451399 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj"
Apr 22 17:54:17.548588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.548556 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e043748c-8906-4b65-9b1c-53c110a7b404-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7mhlw\" (UID: \"e043748c-8906-4b65-9b1c-53c110a7b404\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"
Apr 22 17:54:17.649078 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.649040 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e043748c-8906-4b65-9b1c-53c110a7b404-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7mhlw\" (UID: \"e043748c-8906-4b65-9b1c-53c110a7b404\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"
Apr 22 17:54:17.651588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.651560 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e043748c-8906-4b65-9b1c-53c110a7b404-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7mhlw\" (UID: \"e043748c-8906-4b65-9b1c-53c110a7b404\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"
Apr 22 17:54:17.690512 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:17.690488 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"
Apr 22 17:54:18.789073 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.789040 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:54:18.810245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.810111 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:54:18.811663 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.811515 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:54:18.813525 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.813442 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 17:54:18.813797 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.813780 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 17:54:18.814323 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.814308 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 17:54:18.814471 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.814458 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 17:54:18.814689 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.814672 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 17:54:18.814894 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.814876 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 17:54:18.815182 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.815067 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 17:54:18.815780 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.815757 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 17:54:18.816028 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.815939 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 17:54:18.816320 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.816156 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-995e9epqp30o6\""
Apr 22 17:54:18.816320 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.816165 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-k658d\""
Apr 22 17:54:18.816320 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.816236 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 17:54:18.816320 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.816309 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 17:54:18.817515 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.817337 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 17:54:18.896733 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.896695 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:54:18.899845 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:18.899811 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb42658da_172b_46ac_b125_b71c78667e53.slice/crio-21fac3c6aef3aab09e2ed05b8b99fbc8a380ab4b91cfd0ff2e68143b8c788402 WatchSource:0}: Error finding container 21fac3c6aef3aab09e2ed05b8b99fbc8a380ab4b91cfd0ff2e68143b8c788402: Status 404 returned error can't find the container with id 21fac3c6aef3aab09e2ed05b8b99fbc8a380ab4b91cfd0ff2e68143b8c788402
Apr 22 17:54:18.919007 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.918949 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"21fac3c6aef3aab09e2ed05b8b99fbc8a380ab4b91cfd0ff2e68143b8c788402"}
Apr 22 17:54:18.960563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960527 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:54:18.960682 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960650 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthkj\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-kube-api-access-wthkj\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:54:18.960753 ip-10-0-131-69 kubenswrapper[2583]: I0422
17:54:18.960709 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.960810 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960764 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.960865 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960814 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.960865 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960841 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-web-config\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.960964 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960916 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-config-out\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.960964 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960951 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961061 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.960980 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961061 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961004 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961061 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961033 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961185 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961065 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961185 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961104 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961185 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961131 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961185 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961159 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961304 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961189 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-config\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961304 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961226 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:18.961304 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:18.961255 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.021967 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.020160 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-76d946b44c-6cv98"] Apr 22 17:54:19.049828 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:19.048768 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b8ed86_51c1_40bb_87e4_aeb8c84e9639.slice/crio-3c94d43cc66c5c9541df21c8a8e35c7dbf5b08404314a59e71a2956de64a0d19 WatchSource:0}: Error finding container 3c94d43cc66c5c9541df21c8a8e35c7dbf5b08404314a59e71a2956de64a0d19: Status 404 returned error can't find the container with id 3c94d43cc66c5c9541df21c8a8e35c7dbf5b08404314a59e71a2956de64a0d19 Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.061907 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.061957 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.061990 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062019 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-config\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062054 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062080 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062176 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062159 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wthkj\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-kube-api-access-wthkj\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062192 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062239 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062283 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062310 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-web-config\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062341 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-config-out\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062369 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062401 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062428 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" 
(UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062460 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.062588 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.062495 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.063313 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.063286 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.066245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.064475 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.066245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.064643 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.066245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.066080 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.071542 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.071067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.071542 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.071219 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.078590 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.078313 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.081337 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:54:19.081214 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw"] Apr 22 17:54:19.082038 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.082000 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.088334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.084219 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.088334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.084581 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-config\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.088334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.085156 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-web-config\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.088334 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.085889 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.088697 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.088598 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-config-out\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.090020 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.089114 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.090020 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.089191 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.090020 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.089503 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.090020 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.089677 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.090414 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.090391 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthkj\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-kube-api-access-wthkj\") pod \"prometheus-k8s-0\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.116344 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.116303 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7dbd6fccb-srbbj"] Apr 22 17:54:19.127882 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.125785 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:19.340550 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.340496 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:54:19.352179 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:54:19.352029 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25ffb134_3e79_4f84_b41c_ede7954c7c80.slice/crio-f3dfc9bd8c270adb27dee9fda607468e0dee60aa41218c18c8f741124b428157 WatchSource:0}: Error finding container f3dfc9bd8c270adb27dee9fda607468e0dee60aa41218c18c8f741124b428157: Status 404 returned error can't find the container with id f3dfc9bd8c270adb27dee9fda607468e0dee60aa41218c18c8f741124b428157 Apr 22 17:54:19.927389 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.927349 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw" 
event={"ID":"fb50ea50-20b2-41eb-8fc1-e4b99b72dee3","Type":"ContainerStarted","Data":"a4874389233b4395c3f433ff238294598a2bfa7480f3ba95e889831c44b53cbc"} Apr 22 17:54:19.929820 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.929788 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"f3dfc9bd8c270adb27dee9fda607468e0dee60aa41218c18c8f741124b428157"} Apr 22 17:54:19.932244 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.932210 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"3c94d43cc66c5c9541df21c8a8e35c7dbf5b08404314a59e71a2956de64a0d19"} Apr 22 17:54:19.935655 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.935609 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p96l2" event={"ID":"e510245b-cd68-4ee7-8f7d-e72ddbd61118","Type":"ContainerStarted","Data":"9dd2de2a6bf2b0a3a223aec0bfdf7938b7491a3ab6982a7b9bc5af659097dcc9"} Apr 22 17:54:19.936324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.936267 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:19.939014 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.938275 2583 generic.go:358] "Generic (PLEG): container finished" podID="03bd3116-7dba-4d28-b3ca-6b85602e0bf2" containerID="be6f861c9a7862e5b514646f19efbab3b4a85d73bc4b5e1a54b2294089f38a14" exitCode=0 Apr 22 17:54:19.939014 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.938342 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-464cb" 
event={"ID":"03bd3116-7dba-4d28-b3ca-6b85602e0bf2","Type":"ContainerDied","Data":"be6f861c9a7862e5b514646f19efbab3b4a85d73bc4b5e1a54b2294089f38a14"} Apr 22 17:54:19.941170 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.941139 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw" event={"ID":"e043748c-8906-4b65-9b1c-53c110a7b404","Type":"ContainerStarted","Data":"472d6dfef8fb14e780abbb6b1904e2033efecc0852d9312c7a0ce1f6fe34be39"} Apr 22 17:54:19.944292 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.944241 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" event={"ID":"db7aa446-cb16-4cf3-8ab1-3215184cf20c","Type":"ContainerStarted","Data":"5a413fac4ea924506e7fe3f9b02fb1cc0d95271ea37b6b26de1dc002ce174257"} Apr 22 17:54:19.949031 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.948182 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xk988" event={"ID":"c266f31e-39da-4b15-a687-f1304c2e67b7","Type":"ContainerStarted","Data":"54b7ae9c586550a0103b580d8af572ceda69dfebc63e0b42a4655dd62cdaa330"} Apr 22 17:54:19.949031 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.948993 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xk988" event={"ID":"c266f31e-39da-4b15-a687-f1304c2e67b7","Type":"ContainerStarted","Data":"4539779582b01ffa5a7eb21c37df6ccc428294224759125b555cb9527f61e975"} Apr 22 17:54:19.950817 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.950331 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dqmzc" event={"ID":"e2b78a02-ed41-4b77-adce-5c8606af896b","Type":"ContainerStarted","Data":"7425d4e435a703b635f4dadb321088d08729f2d91af840ed3642f095a84883c2"} Apr 22 17:54:19.955201 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.955180 2583 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-runtime-extractor-5skr5" event={"ID":"6609b3c4-cfd3-49ff-a1e6-faff9ad000b9","Type":"ContainerStarted","Data":"f1be4c60388bd8921a5b66c97a2594f2bd58d4f7723120065e31ab5593177f60"} Apr 22 17:54:19.957605 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.957224 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dr4qw" podStartSLOduration=2.387472013 podStartE2EDuration="7.957208886s" podCreationTimestamp="2026-04-22 17:54:12 +0000 UTC" firstStartedPulling="2026-04-22 17:54:13.289336971 +0000 UTC m=+35.166606123" lastFinishedPulling="2026-04-22 17:54:18.859073846 +0000 UTC m=+40.736342996" observedRunningTime="2026-04-22 17:54:19.956095595 +0000 UTC m=+41.833364765" watchObservedRunningTime="2026-04-22 17:54:19.957208886 +0000 UTC m=+41.834478056" Apr 22 17:54:19.958942 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.958919 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pbfbf" event={"ID":"db6e7b89-b68b-4920-9aaa-d35998aee879","Type":"ContainerStarted","Data":"81ab3af3bf74a7df3b014526f6298aa7b261a81523418c07490675d9f54f9314"} Apr 22 17:54:19.959046 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.958951 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pbfbf" event={"ID":"db6e7b89-b68b-4920-9aaa-d35998aee879","Type":"ContainerStarted","Data":"ab21e9b881e748181665c416dc5dd98f65c8b71ab55c570614ec2ba7a2becc61"} Apr 22 17:54:19.961677 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.961566 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:19.964551 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.963961 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" 
event={"ID":"c6252ec2-1bae-4d24-83c8-f43a6bdb5885","Type":"ContainerStarted","Data":"0fea5f6e140fc6e04f4cca4fdc730a2068df89e364ee124d899d2825cab227db"} Apr 22 17:54:19.964551 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.963988 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" event={"ID":"c6252ec2-1bae-4d24-83c8-f43a6bdb5885","Type":"ContainerStarted","Data":"527accdb99a0b9917bd49636344a43360cebf6d5b555c9bbc6bbc5f0ccbea081"} Apr 22 17:54:19.964551 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.964002 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" event={"ID":"c6252ec2-1bae-4d24-83c8-f43a6bdb5885","Type":"ContainerStarted","Data":"8cdd17190faa17ff574ad76228dacc52cc020ca01b3c4b27939a9a3448efb291"} Apr 22 17:54:20.000025 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:19.999784 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dqmzc" podStartSLOduration=2.9165072260000002 podStartE2EDuration="8.999768274s" podCreationTimestamp="2026-04-22 17:54:11 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.481126064 +0000 UTC m=+34.358395214" lastFinishedPulling="2026-04-22 17:54:18.564387103 +0000 UTC m=+40.441656262" observedRunningTime="2026-04-22 17:54:19.997403286 +0000 UTC m=+41.874672458" watchObservedRunningTime="2026-04-22 17:54:19.999768274 +0000 UTC m=+41.877037444" Apr 22 17:54:20.039463 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:20.039405 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xk988" podStartSLOduration=35.953291583 podStartE2EDuration="42.039388614s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.77297781 +0000 UTC m=+34.650246962" lastFinishedPulling="2026-04-22 17:54:18.859074829 +0000 UTC m=+40.736343993" 
observedRunningTime="2026-04-22 17:54:20.017102962 +0000 UTC m=+41.894372132" watchObservedRunningTime="2026-04-22 17:54:20.039388614 +0000 UTC m=+41.916657783" Apr 22 17:54:20.061116 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:20.059659 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5skr5" podStartSLOduration=2.82735158 podStartE2EDuration="9.059604596s" podCreationTimestamp="2026-04-22 17:54:11 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.653313082 +0000 UTC m=+34.530582243" lastFinishedPulling="2026-04-22 17:54:18.885566098 +0000 UTC m=+40.762835259" observedRunningTime="2026-04-22 17:54:20.058663933 +0000 UTC m=+41.935933102" watchObservedRunningTime="2026-04-22 17:54:20.059604596 +0000 UTC m=+41.936873766" Apr 22 17:54:20.061116 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:20.060795 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-p96l2" podStartSLOduration=35.974620926 podStartE2EDuration="42.060781169s" podCreationTimestamp="2026-04-22 17:53:38 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.772909186 +0000 UTC m=+34.650178345" lastFinishedPulling="2026-04-22 17:54:18.859069433 +0000 UTC m=+40.736338588" observedRunningTime="2026-04-22 17:54:20.038508343 +0000 UTC m=+41.915777513" watchObservedRunningTime="2026-04-22 17:54:20.060781169 +0000 UTC m=+41.938050338" Apr 22 17:54:20.114691 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:20.114616 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-98j6g" podStartSLOduration=3.146773977 podStartE2EDuration="8.114595682s" podCreationTimestamp="2026-04-22 17:54:12 +0000 UTC" firstStartedPulling="2026-04-22 17:54:13.914372682 +0000 UTC m=+35.791641831" lastFinishedPulling="2026-04-22 17:54:18.882194389 +0000 UTC m=+40.759463536" observedRunningTime="2026-04-22 
17:54:20.090846071 +0000 UTC m=+41.968115241" watchObservedRunningTime="2026-04-22 17:54:20.114595682 +0000 UTC m=+41.991864852" Apr 22 17:54:20.115427 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:20.115379 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pbfbf" podStartSLOduration=3.042435707 podStartE2EDuration="9.115366606s" podCreationTimestamp="2026-04-22 17:54:11 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.491463379 +0000 UTC m=+34.368732525" lastFinishedPulling="2026-04-22 17:54:18.564394273 +0000 UTC m=+40.441663424" observedRunningTime="2026-04-22 17:54:20.11390069 +0000 UTC m=+41.991169860" watchObservedRunningTime="2026-04-22 17:54:20.115366606 +0000 UTC m=+41.992635777" Apr 22 17:54:20.968047 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:20.967995 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-464cb" event={"ID":"03bd3116-7dba-4d28-b3ca-6b85602e0bf2","Type":"ContainerStarted","Data":"42abb88241865598849c52619f89db5f2b2a28cb27babdecd0aa0609b322d725"} Apr 22 17:54:22.979325 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.979290 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff" exitCode=0 Apr 22 17:54:22.979792 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.979377 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} Apr 22 17:54:22.981521 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.981492 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" 
event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"b85892496e6d163bc01dbdaaf2fa5e14579ca228d15191174642574bd807cc86"} Apr 22 17:54:22.981635 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.981525 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"d7f4728a43cee2f5f8fe310da4e164ce361826941dd82feb589ab414d32b5ec1"} Apr 22 17:54:22.981635 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.981535 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"bb624a6cf4d2e4d0b487986c14204ce92d7efd915cc1e5a6a84c19ef452d277b"} Apr 22 17:54:22.983730 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.983705 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-464cb" event={"ID":"03bd3116-7dba-4d28-b3ca-6b85602e0bf2","Type":"ContainerStarted","Data":"1873d8d1d1f36a5ea5af5be84b7bca73a76be2b62544096a394af6340f327bad"} Apr 22 17:54:22.985509 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.985487 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw" event={"ID":"e043748c-8906-4b65-9b1c-53c110a7b404","Type":"ContainerStarted","Data":"bf1590d3b3542442d12a504abf2a6cd882c6bc231e31e800c4ddcd42832156c5"} Apr 22 17:54:22.985652 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.985635 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw" Apr 22 17:54:22.986960 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.986936 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" 
event={"ID":"db7aa446-cb16-4cf3-8ab1-3215184cf20c","Type":"ContainerStarted","Data":"6b73d502e93575a20c752c75d19b710e9491b4a32959ce896203c8166cf07ac9"} Apr 22 17:54:22.988791 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.988766 2583 generic.go:358] "Generic (PLEG): container finished" podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="15ab54b3a921d6ad199a54bc98a4c561cc61a648d4a9b485cacedc1608eb687d" exitCode=0 Apr 22 17:54:22.988889 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.988824 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"15ab54b3a921d6ad199a54bc98a4c561cc61a648d4a9b485cacedc1608eb687d"} Apr 22 17:54:22.991309 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:22.991290 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw" Apr 22 17:54:23.029082 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:23.029022 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" podStartSLOduration=2.736073236 podStartE2EDuration="6.029005908s" podCreationTimestamp="2026-04-22 17:54:17 +0000 UTC" firstStartedPulling="2026-04-22 17:54:19.136540418 +0000 UTC m=+41.013809565" lastFinishedPulling="2026-04-22 17:54:22.429473088 +0000 UTC m=+44.306742237" observedRunningTime="2026-04-22 17:54:23.027935239 +0000 UTC m=+44.905204431" watchObservedRunningTime="2026-04-22 17:54:23.029005908 +0000 UTC m=+44.906275080" Apr 22 17:54:23.060258 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:23.060206 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-464cb" podStartSLOduration=5.522463464 podStartE2EDuration="11.060189698s" podCreationTimestamp="2026-04-22 17:54:12 +0000 UTC" firstStartedPulling="2026-04-22 17:54:13.026658216 
+0000 UTC m=+34.903927375" lastFinishedPulling="2026-04-22 17:54:18.564384446 +0000 UTC m=+40.441653609" observedRunningTime="2026-04-22 17:54:23.059410116 +0000 UTC m=+44.936679284" watchObservedRunningTime="2026-04-22 17:54:23.060189698 +0000 UTC m=+44.937458867" Apr 22 17:54:23.061113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:23.061076 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7mhlw" podStartSLOduration=2.72916713 podStartE2EDuration="6.061066005s" podCreationTimestamp="2026-04-22 17:54:17 +0000 UTC" firstStartedPulling="2026-04-22 17:54:19.097387817 +0000 UTC m=+40.974656967" lastFinishedPulling="2026-04-22 17:54:22.429286682 +0000 UTC m=+44.306555842" observedRunningTime="2026-04-22 17:54:23.041946904 +0000 UTC m=+44.919216073" watchObservedRunningTime="2026-04-22 17:54:23.061066005 +0000 UTC m=+44.938335203" Apr 22 17:54:23.996874 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:23.996779 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"79f23e63550bd5425854751591e515b308d0fb0e12764096920fd5b99862b056"} Apr 22 17:54:23.996874 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:23.996825 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"af17ba6f49aae761be2fc268096046b794d4a9bd1b8127f72daa1a053fae6659"} Apr 22 17:54:25.006011 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.005970 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" event={"ID":"50b8ed86-51c1-40bb-87e4-aeb8c84e9639","Type":"ContainerStarted","Data":"819a78176946ffec1afff69868c50b87762938f938acf31ae753a086ab83d93f"} Apr 22 17:54:25.006447 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:54:25.006251 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" Apr 22 17:54:25.009695 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.009657 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"4277f68a96b1194b01b9ca76ed778738726f64d56fed4d732aa71744d48cb2f4"} Apr 22 17:54:25.009843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.009702 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"da3450c2c1b71b19457e50f604a2d144e8a8c4ebe028956b76279518fa8d82b7"} Apr 22 17:54:25.009843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.009718 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"6bf90dafd08b20b9b0622aa6d126b18c3320fce8da9ec6f1e0dee0aa30d4dcfd"} Apr 22 17:54:25.009843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.009729 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"ea481a32b4a9fa1a5e8b2be892480835ed6d54a8b4e686de05027bf49df3f403"} Apr 22 17:54:25.009843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.009747 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"310156fb5c3beb7124051023483ffd89855ddd7a94bab293dc91d2e0bbca6ec6"} Apr 22 17:54:25.009843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.009763 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerStarted","Data":"08018f5cda7250c167c0c0f153bb5826d6659cb4ebb34607b4754d2aed6d1126"} Apr 22 17:54:25.031560 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.031515 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" podStartSLOduration=6.441158265 podStartE2EDuration="11.031502301s" podCreationTimestamp="2026-04-22 17:54:14 +0000 UTC" firstStartedPulling="2026-04-22 17:54:19.056352735 +0000 UTC m=+40.933621888" lastFinishedPulling="2026-04-22 17:54:23.646696763 +0000 UTC m=+45.523965924" observedRunningTime="2026-04-22 17:54:25.030663815 +0000 UTC m=+46.907932982" watchObservedRunningTime="2026-04-22 17:54:25.031502301 +0000 UTC m=+46.908771471" Apr 22 17:54:25.064825 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:25.064769 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.728110732 podStartE2EDuration="12.06475335s" podCreationTimestamp="2026-04-22 17:54:13 +0000 UTC" firstStartedPulling="2026-04-22 17:54:18.901807136 +0000 UTC m=+40.779076281" lastFinishedPulling="2026-04-22 17:54:24.238449748 +0000 UTC m=+46.115718899" observedRunningTime="2026-04-22 17:54:25.061781798 +0000 UTC m=+46.939050966" watchObservedRunningTime="2026-04-22 17:54:25.06475335 +0000 UTC m=+46.942022519" Apr 22 17:54:26.015537 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:26.015499 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} Apr 22 17:54:26.015903 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:26.015545 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} Apr 22 17:54:27.021879 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:27.021846 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} Apr 22 17:54:27.022214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:27.021887 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} Apr 22 17:54:27.022214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:27.021901 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} Apr 22 17:54:27.022214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:27.021910 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerStarted","Data":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} Apr 22 17:54:27.050660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:27.050582 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.551981745 podStartE2EDuration="9.050562591s" podCreationTimestamp="2026-04-22 17:54:18 +0000 UTC" firstStartedPulling="2026-04-22 17:54:19.355130909 +0000 UTC m=+41.232400056" lastFinishedPulling="2026-04-22 17:54:25.853711739 +0000 UTC m=+47.730980902" observedRunningTime="2026-04-22 
17:54:27.047922868 +0000 UTC m=+48.925192058" watchObservedRunningTime="2026-04-22 17:54:27.050562591 +0000 UTC m=+48.927831760" Apr 22 17:54:29.126576 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:29.126538 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:30.974770 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:30.974736 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pbfbf" Apr 22 17:54:31.021795 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:31.021770 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-76d946b44c-6cv98" Apr 22 17:54:35.860780 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:35.860753 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t6kbb" Apr 22 17:54:37.451741 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:37.451704 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" Apr 22 17:54:37.451741 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:37.451751 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" Apr 22 17:54:51.975227 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:51.975118 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-p96l2" Apr 22 17:54:57.457403 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:57.457361 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" Apr 22 17:54:57.461158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:57.461136 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/metrics-server-7dbd6fccb-srbbj" Apr 22 17:54:59.645943 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:59.645888 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:54:59.664404 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:54:59.664377 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:00.137194 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:00.137165 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:00.190610 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:00.190582 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dqmzc_e2b78a02-ed41-4b77-adce-5c8606af896b/serve-healthcheck-canary/0.log" Apr 22 17:55:12.942510 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.942463 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:55:12.943897 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.943840 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="alertmanager" containerID="cri-o://08018f5cda7250c167c0c0f153bb5826d6659cb4ebb34607b4754d2aed6d1126" gracePeriod=120 Apr 22 17:55:12.944473 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.944445 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="prom-label-proxy" containerID="cri-o://4277f68a96b1194b01b9ca76ed778738726f64d56fed4d732aa71744d48cb2f4" gracePeriod=120 Apr 22 17:55:12.945033 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.944760 2583 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-web" containerID="cri-o://ea481a32b4a9fa1a5e8b2be892480835ed6d54a8b4e686de05027bf49df3f403" gracePeriod=120 Apr 22 17:55:12.945033 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.944882 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-metric" containerID="cri-o://da3450c2c1b71b19457e50f604a2d144e8a8c4ebe028956b76279518fa8d82b7" gracePeriod=120 Apr 22 17:55:12.945033 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.944944 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy" containerID="cri-o://6bf90dafd08b20b9b0622aa6d126b18c3320fce8da9ec6f1e0dee0aa30d4dcfd" gracePeriod=120 Apr 22 17:55:12.945243 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:12.944771 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="config-reloader" containerID="cri-o://310156fb5c3beb7124051023483ffd89855ddd7a94bab293dc91d2e0bbca6ec6" gracePeriod=120 Apr 22 17:55:13.168605 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168555 2583 generic.go:358] "Generic (PLEG): container finished" podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="4277f68a96b1194b01b9ca76ed778738726f64d56fed4d732aa71744d48cb2f4" exitCode=0 Apr 22 17:55:13.168605 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168596 2583 generic.go:358] "Generic (PLEG): container finished" podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="6bf90dafd08b20b9b0622aa6d126b18c3320fce8da9ec6f1e0dee0aa30d4dcfd" exitCode=0 Apr 22 17:55:13.168605 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168606 2583 generic.go:358] "Generic (PLEG): container finished" podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="310156fb5c3beb7124051023483ffd89855ddd7a94bab293dc91d2e0bbca6ec6" exitCode=0 Apr 22 17:55:13.168868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168617 2583 generic.go:358] "Generic (PLEG): container finished" podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="08018f5cda7250c167c0c0f153bb5826d6659cb4ebb34607b4754d2aed6d1126" exitCode=0 Apr 22 17:55:13.168868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168639 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"4277f68a96b1194b01b9ca76ed778738726f64d56fed4d732aa71744d48cb2f4"} Apr 22 17:55:13.168868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168683 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"6bf90dafd08b20b9b0622aa6d126b18c3320fce8da9ec6f1e0dee0aa30d4dcfd"} Apr 22 17:55:13.168868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168700 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"310156fb5c3beb7124051023483ffd89855ddd7a94bab293dc91d2e0bbca6ec6"} Apr 22 17:55:13.168868 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:13.168712 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"08018f5cda7250c167c0c0f153bb5826d6659cb4ebb34607b4754d2aed6d1126"} Apr 22 17:55:14.179173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.179136 2583 generic.go:358] "Generic (PLEG): container finished" 
podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="da3450c2c1b71b19457e50f604a2d144e8a8c4ebe028956b76279518fa8d82b7" exitCode=0 Apr 22 17:55:14.179173 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.179168 2583 generic.go:358] "Generic (PLEG): container finished" podID="b42658da-172b-46ac-b125-b71c78667e53" containerID="ea481a32b4a9fa1a5e8b2be892480835ed6d54a8b4e686de05027bf49df3f403" exitCode=0 Apr 22 17:55:14.179680 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.179258 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"da3450c2c1b71b19457e50f604a2d144e8a8c4ebe028956b76279518fa8d82b7"} Apr 22 17:55:14.179680 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.179288 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"ea481a32b4a9fa1a5e8b2be892480835ed6d54a8b4e686de05027bf49df3f403"} Apr 22 17:55:14.224744 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.224701 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:14.331450 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331376 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-tls-assets\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331450 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331427 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-config-out\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331450 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331451 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-web-config\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331468 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331530 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wx8m\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-kube-api-access-6wx8m\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331556 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331580 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-config-volume\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331603 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-main-tls\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331649 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-cluster-tls-config\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.331713 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331689 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-metrics-client-ca\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.332061 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331718 2583 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-main-db\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.332061 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331749 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.332061 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.331780 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b42658da-172b-46ac-b125-b71c78667e53\" (UID: \"b42658da-172b-46ac-b125-b71c78667e53\") " Apr 22 17:55:14.332826 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.332517 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:55:14.332826 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.332714 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). 
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:14.333738 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.333692 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:55:14.334529 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.334497 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.334884 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.334856 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.336126 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.336094 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.336267 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.336241 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:55:14.336355 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.336338 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-kube-api-access-6wx8m" (OuterVolumeSpecName: "kube-api-access-6wx8m") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "kube-api-access-6wx8m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:55:14.336419 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.336388 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-config-volume" (OuterVolumeSpecName: "config-volume") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.336565 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.336546 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-config-out" (OuterVolumeSpecName: "config-out") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:55:14.336647 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.336619 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.340505 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.340469 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.346977 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.346937 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-web-config" (OuterVolumeSpecName: "web-config") pod "b42658da-172b-46ac-b125-b71c78667e53" (UID: "b42658da-172b-46ac-b125-b71c78667e53"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:14.432802 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432769 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6wx8m\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-kube-api-access-6wx8m\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.432802 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432799 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432835 2583 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-config-volume\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432845 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-main-tls\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432855 2583 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-cluster-tls-config\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432865 2583 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b42658da-172b-46ac-b125-b71c78667e53-metrics-client-ca\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 
17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432873 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-alertmanager-main-db\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432882 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432893 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432903 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b42658da-172b-46ac-b125-b71c78667e53-tls-assets\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432911 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b42658da-172b-46ac-b125-b71c78667e53-config-out\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:14.432920 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-web-config\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:14.433041 ip-10-0-131-69 kubenswrapper[2583]: I0422 
17:55:14.432928 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b42658da-172b-46ac-b125-b71c78667e53-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:15.184945 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.184907 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b42658da-172b-46ac-b125-b71c78667e53","Type":"ContainerDied","Data":"21fac3c6aef3aab09e2ed05b8b99fbc8a380ab4b91cfd0ff2e68143b8c788402"} Apr 22 17:55:15.185298 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.184963 2583 scope.go:117] "RemoveContainer" containerID="4277f68a96b1194b01b9ca76ed778738726f64d56fed4d732aa71744d48cb2f4" Apr 22 17:55:15.185298 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.185026 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.192287 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.192270 2583 scope.go:117] "RemoveContainer" containerID="da3450c2c1b71b19457e50f604a2d144e8a8c4ebe028956b76279518fa8d82b7" Apr 22 17:55:15.199147 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.199131 2583 scope.go:117] "RemoveContainer" containerID="6bf90dafd08b20b9b0622aa6d126b18c3320fce8da9ec6f1e0dee0aa30d4dcfd" Apr 22 17:55:15.205568 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.205545 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:55:15.206447 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.206431 2583 scope.go:117] "RemoveContainer" containerID="ea481a32b4a9fa1a5e8b2be892480835ed6d54a8b4e686de05027bf49df3f403" Apr 22 17:55:15.210297 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.210275 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 
17:55:15.214086 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.214068 2583 scope.go:117] "RemoveContainer" containerID="310156fb5c3beb7124051023483ffd89855ddd7a94bab293dc91d2e0bbca6ec6" Apr 22 17:55:15.220561 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.220546 2583 scope.go:117] "RemoveContainer" containerID="08018f5cda7250c167c0c0f153bb5826d6659cb4ebb34607b4754d2aed6d1126" Apr 22 17:55:15.227271 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.227253 2583 scope.go:117] "RemoveContainer" containerID="15ab54b3a921d6ad199a54bc98a4c561cc61a648d4a9b485cacedc1608eb687d" Apr 22 17:55:15.234897 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.234846 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:55:15.235232 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235216 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="config-reloader" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235234 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="config-reloader" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235244 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-metric" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235251 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-metric" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235257 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="prom-label-proxy" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235262 2583 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="prom-label-proxy" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235275 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="init-config-reloader" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235280 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="init-config-reloader" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235290 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-web" Apr 22 17:55:15.235296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235296 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-web" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235305 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="alertmanager" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235310 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="alertmanager" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235318 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235323 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 
17:55:15.235369 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="config-reloader" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235378 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="alertmanager" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235385 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-metric" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235392 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="prom-label-proxy" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235399 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy" Apr 22 17:55:15.235583 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.235405 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b42658da-172b-46ac-b125-b71c78667e53" containerName="kube-rbac-proxy-web" Apr 22 17:55:15.240762 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.240745 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.243430 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243412 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 17:55:15.243514 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243413 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 17:55:15.243514 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243446 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 17:55:15.243618 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243534 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 17:55:15.243618 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243535 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 17:55:15.243618 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243566 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7thtv\"" Apr 22 17:55:15.243865 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243846 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 17:55:15.243927 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243898 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 17:55:15.243979 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.243846 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 17:55:15.248333 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.248290 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 17:55:15.250785 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.250769 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:55:15.339006 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.338919 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/701fe7be-306c-47eb-b241-4cf8a0e06584-tls-assets\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339006 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.338966 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339006 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.338999 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339051 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339100 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-config-volume\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339116 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339133 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-web-config\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339154 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/701fe7be-306c-47eb-b241-4cf8a0e06584-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 
kubenswrapper[2583]: I0422 17:55:15.339172 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/701fe7be-306c-47eb-b241-4cf8a0e06584-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339214 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339190 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/701fe7be-306c-47eb-b241-4cf8a0e06584-config-out\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339411 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339217 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/701fe7be-306c-47eb-b241-4cf8a0e06584-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339411 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339254 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f255c\" (UniqueName: \"kubernetes.io/projected/701fe7be-306c-47eb-b241-4cf8a0e06584-kube-api-access-f255c\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.339411 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.339292 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440227 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440196 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-config-volume\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440327 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440233 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440327 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440251 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-web-config\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440327 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440266 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/701fe7be-306c-47eb-b241-4cf8a0e06584-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440431 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440325 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/701fe7be-306c-47eb-b241-4cf8a0e06584-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440431 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440368 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/701fe7be-306c-47eb-b241-4cf8a0e06584-config-out\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440431 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440406 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/701fe7be-306c-47eb-b241-4cf8a0e06584-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440433 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f255c\" (UniqueName: \"kubernetes.io/projected/701fe7be-306c-47eb-b241-4cf8a0e06584-kube-api-access-f255c\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:55:15.440563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440468 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
17:55:15.440563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440503 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/701fe7be-306c-47eb-b241-4cf8a0e06584-tls-assets\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.440563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440550 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.440785 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440588 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.440785 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.440619 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.441093 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.441070 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/701fe7be-306c-47eb-b241-4cf8a0e06584-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.441093 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.441088 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/701fe7be-306c-47eb-b241-4cf8a0e06584-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.441603 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.441574 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/701fe7be-306c-47eb-b241-4cf8a0e06584-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.443503 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.443387 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/701fe7be-306c-47eb-b241-4cf8a0e06584-config-out\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.443503 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.443490 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-web-config\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.443891 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.443517 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.443891 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.443545 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/701fe7be-306c-47eb-b241-4cf8a0e06584-tls-assets\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.443891 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.443750 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.443891 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.443846 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-config-volume\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.444111 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.444089 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.444170 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.444114 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.445333 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.445313 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/701fe7be-306c-47eb-b241-4cf8a0e06584-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.448563 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.448542 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f255c\" (UniqueName: \"kubernetes.io/projected/701fe7be-306c-47eb-b241-4cf8a0e06584-kube-api-access-f255c\") pod \"alertmanager-main-0\" (UID: \"701fe7be-306c-47eb-b241-4cf8a0e06584\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.551470 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.551437 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:55:15.681736 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:15.681703 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:55:15.684836 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:55:15.684801 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701fe7be_306c_47eb_b241_4cf8a0e06584.slice/crio-e7586f0cb8996f2418de46fa7f32778eb40f7414154d2c079a4f3a85fb93671e WatchSource:0}: Error finding container e7586f0cb8996f2418de46fa7f32778eb40f7414154d2c079a4f3a85fb93671e: Status 404 returned error can't find the container with id e7586f0cb8996f2418de46fa7f32778eb40f7414154d2c079a4f3a85fb93671e
Apr 22 17:55:16.190036 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.189997 2583 generic.go:358] "Generic (PLEG): container finished" podID="701fe7be-306c-47eb-b241-4cf8a0e06584" containerID="c649ed7984993a025e72bf669ca60c47ab1fb24eaa9919068183f74023c09e43" exitCode=0
Apr 22 17:55:16.190389 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.190057 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerDied","Data":"c649ed7984993a025e72bf669ca60c47ab1fb24eaa9919068183f74023c09e43"}
Apr 22 17:55:16.190389 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.190077 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"e7586f0cb8996f2418de46fa7f32778eb40f7414154d2c079a4f3a85fb93671e"}
Apr 22 17:55:16.665638 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.665592 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b42658da-172b-46ac-b125-b71c78667e53" path="/var/lib/kubelet/pods/b42658da-172b-46ac-b125-b71c78667e53/volumes"
Apr 22 17:55:16.899920 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.899831 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"]
Apr 22 17:55:16.903515 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.903491 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:16.906160 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.906137 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 17:55:16.906419 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.906402 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 17:55:16.906765 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.906737 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 17:55:16.906859 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.906763 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 17:55:16.907228 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.907201 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 17:55:16.907302 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.907202 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-mfm5t\""
Apr 22 17:55:16.911237 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.911192 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 17:55:16.913490 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:16.913470 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"]
Apr 22 17:55:17.055440 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055403 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055653 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055465 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-federate-client-tls\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055653 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055518 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-secret-telemeter-client\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055653 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055596 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-serving-certs-ca-bundle\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055653 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055619 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-metrics-client-ca\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055847 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055677 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkx42\" (UniqueName: \"kubernetes.io/projected/c44d7ace-2367-4456-b866-2a706fd03e27-kube-api-access-bkx42\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055847 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055717 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-telemeter-client-tls\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.055847 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.055753 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156559 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156594 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-federate-client-tls\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156660 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156641 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-secret-telemeter-client\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156931 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156709 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-serving-certs-ca-bundle\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156931 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156735 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-metrics-client-ca\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156931 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156759 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkx42\" (UniqueName: \"kubernetes.io/projected/c44d7ace-2367-4456-b866-2a706fd03e27-kube-api-access-bkx42\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156931 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156872 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-telemeter-client-tls\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.156931 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.156930 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.157503 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.157476 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-serving-certs-ca-bundle\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.157756 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.157731 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.157866 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.157743 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c44d7ace-2367-4456-b866-2a706fd03e27-metrics-client-ca\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.159264 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.159244 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-telemeter-client-tls\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.159372 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.159325 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-federate-client-tls\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.159471 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.159451 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-secret-telemeter-client\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.159508 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.159451 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c44d7ace-2367-4456-b866-2a706fd03e27-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.164797 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.164770 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkx42\" (UniqueName: \"kubernetes.io/projected/c44d7ace-2367-4456-b866-2a706fd03e27-kube-api-access-bkx42\") pod \"telemeter-client-7c4f7df48c-7jl9h\" (UID: \"c44d7ace-2367-4456-b866-2a706fd03e27\") " pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.195255 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.195214 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"caa2a79335c6d87cef8513b00e21cdecc5c85f02a9fd622af3bc8bad8a00c77b"}
Apr 22 17:55:17.195255 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.195255 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"ffe4f0adc41d8b6626850c5d6ada1b7f8384fa8895fe7ea9070ba1cedaf672f0"}
Apr 22 17:55:17.195743 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.195268 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"bf7a69bfc33f764f661f4084fc1814dce8fb9e6ccfdc3a9b2184c31aa1cf38e8"}
Apr 22 17:55:17.195743 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.195280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"971742a7ea30f116b429432ba8734b159e18d62df7e339a3a4b3ab0e57d88357"}
Apr 22 17:55:17.195743 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.195290 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"3e38aefb914338c96aac4f83056a9c44278046799280ad03dcd75aa30a187e2e"}
Apr 22 17:55:17.195743 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.195301 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"701fe7be-306c-47eb-b241-4cf8a0e06584","Type":"ContainerStarted","Data":"17139fe96b7a9b878181dd9c59172861fcbfd28ecd5a2b7c1c73e1141ed8f900"}
Apr 22 17:55:17.198807 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.198763 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:55:17.199672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.199288 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="prometheus" containerID="cri-o://a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05" gracePeriod=600
Apr 22 17:55:17.199672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.199413 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-thanos" containerID="cri-o://59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e" gracePeriod=600
Apr 22 17:55:17.199672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.199466 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy" containerID="cri-o://87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba" gracePeriod=600
Apr 22 17:55:17.199672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.199511 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-web" containerID="cri-o://779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be" gracePeriod=600
Apr 22 17:55:17.199672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.199558 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="thanos-sidecar" containerID="cri-o://53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479" gracePeriod=600
Apr 22 17:55:17.199672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.199653 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="config-reloader" containerID="cri-o://faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448" gracePeriod=600
Apr 22 17:55:17.213385 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.213364 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"
Apr 22 17:55:17.237849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.237807 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.23779367 podStartE2EDuration="2.23779367s" podCreationTimestamp="2026-04-22 17:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:17.236531206 +0000 UTC m=+99.113800373" watchObservedRunningTime="2026-04-22 17:55:17.23779367 +0000 UTC m=+99.115062830"
Apr 22 17:55:17.347286 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.347259 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h"]
Apr 22 17:55:17.350088 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:55:17.350057 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc44d7ace_2367_4456_b866_2a706fd03e27.slice/crio-801f84d41eca0f8bf22a0d757a840c39b8b3c5cce8a1459ad5e8be0ff610c73e WatchSource:0}: Error finding container 801f84d41eca0f8bf22a0d757a840c39b8b3c5cce8a1459ad5e8be0ff610c73e: Status 404 returned error can't find the container with id 801f84d41eca0f8bf22a0d757a840c39b8b3c5cce8a1459ad5e8be0ff610c73e
Apr 22 17:55:17.627119 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.627097 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:17.762490 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762450 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-kube-rbac-proxy\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.762490 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762494 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-serving-certs-ca-bundle\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.762757 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762526 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wthkj\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-kube-api-access-wthkj\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.762948 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762901 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-trusted-ca-bundle\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763026 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762951 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-db\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763026 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762954 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:17.763026 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.762997 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-thanos-prometheus-http-client-file\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763040 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763067 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-tls-assets\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763083 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-rulefiles-0\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763109 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-grpc-tls\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763158 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763136 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-metrics-client-certs\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763169 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-tls\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763214 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-config-out\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763253 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763302 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-kubelet-serving-ca-bundle\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763330 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-metrics-client-ca\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763355 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-config\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763768 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763402 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-web-config\") pod \"25ffb134-3e79-4f84-b41c-ede7954c7c80\" (UID: \"25ffb134-3e79-4f84-b41c-ede7954c7c80\") "
Apr 22 17:55:17.763768 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763425 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:17.763768 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763691 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 17:55:17.763768 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.763713 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 17:55:17.764487 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.764457 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:17.764948 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.764669 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:55:17.764948 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.764697 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:17.765688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.765659 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:55:17.768233 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768207 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-config-out" (OuterVolumeSpecName: "config-out") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:55:17.768333 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768256 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.768333 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768287 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-kube-api-access-wthkj" (OuterVolumeSpecName: "kube-api-access-wthkj") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "kube-api-access-wthkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:55:17.768333 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768285 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.768653 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768604 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:55:17.768798 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768773 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.768881 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.768846 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-config" (OuterVolumeSpecName: "config") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.769045 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.769025 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.769106 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.769076 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.769336 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.769317 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.769872 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.769848 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.779711 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.779683 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-web-config" (OuterVolumeSpecName: "web-config") pod "25ffb134-3e79-4f84-b41c-ede7954c7c80" (UID: "25ffb134-3e79-4f84-b41c-ede7954c7c80"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:55:17.864141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864112 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-config-out\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864138 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864152 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-kubelet-serving-ca-bundle\") on node 
\"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864168 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-configmap-metrics-client-ca\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864178 2583 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-config\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864187 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-web-config\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864199 2583 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-kube-rbac-proxy\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864211 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wthkj\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-kube-api-access-wthkj\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864221 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-db\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: 
I0422 17:55:17.864231 2583 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864239 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864248 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25ffb134-3e79-4f84-b41c-ede7954c7c80-tls-assets\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864258 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25ffb134-3e79-4f84-b41c-ede7954c7c80-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864266 2583 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-grpc-tls\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:17.864278 2583 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-metrics-client-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:17.864328 ip-10-0-131-69 kubenswrapper[2583]: I0422 
17:55:17.864290 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/25ffb134-3e79-4f84-b41c-ede7954c7c80-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 17:55:18.200543 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.200503 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h" event={"ID":"c44d7ace-2367-4456-b866-2a706fd03e27","Type":"ContainerStarted","Data":"801f84d41eca0f8bf22a0d757a840c39b8b3c5cce8a1459ad5e8be0ff610c73e"} Apr 22 17:55:18.203670 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203643 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e" exitCode=0 Apr 22 17:55:18.203670 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203668 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba" exitCode=0 Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203677 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be" exitCode=0 Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203685 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479" exitCode=0 Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203692 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448" exitCode=0 Apr 22 
17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203699 2583 generic.go:358] "Generic (PLEG): container finished" podID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05" exitCode=0 Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203727 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203770 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203786 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203795 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.203849 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203817 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e" Apr 22 17:55:18.204184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203803 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} Apr 22 17:55:18.204184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203925 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} Apr 22 17:55:18.204184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203953 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} Apr 22 17:55:18.204184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.203967 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25ffb134-3e79-4f84-b41c-ede7954c7c80","Type":"ContainerDied","Data":"f3dfc9bd8c270adb27dee9fda607468e0dee60aa41218c18c8f741124b428157"} Apr 22 17:55:18.212663 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.212642 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba" Apr 22 17:55:18.220585 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.220563 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be" Apr 22 17:55:18.228486 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.228451 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479" Apr 22 17:55:18.235141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.235060 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:55:18.237620 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.237592 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448" Apr 22 17:55:18.238074 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.238029 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:55:18.244593 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.244572 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05" Apr 22 17:55:18.252479 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.252452 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff" Apr 22 17:55:18.259322 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.259304 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e" Apr 22 17:55:18.259582 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:55:18.259562 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e" Apr 22 17:55:18.259672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.259604 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} err="failed to get container status \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist" Apr 22 17:55:18.259672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.259661 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba" Apr 22 17:55:18.259922 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:55:18.259902 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba" Apr 22 17:55:18.259967 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.259928 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} err="failed to get container status \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist" Apr 22 17:55:18.259967 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.259946 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be" Apr 22 17:55:18.260201 ip-10-0-131-69 
kubenswrapper[2583]: E0422 17:55:18.260185 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be" Apr 22 17:55:18.260236 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260207 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} err="failed to get container status \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist" Apr 22 17:55:18.260236 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260222 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479" Apr 22 17:55:18.260432 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:55:18.260414 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479" Apr 22 17:55:18.260476 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260437 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} err="failed to 
get container status \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist" Apr 22 17:55:18.260476 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260452 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448" Apr 22 17:55:18.260697 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:55:18.260680 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448" Apr 22 17:55:18.260759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260705 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} err="failed to get container status \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist" Apr 22 17:55:18.260759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260725 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05" Apr 22 17:55:18.260952 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:55:18.260936 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05" Apr 22 17:55:18.260995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260957 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} err="failed to get container status \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": rpc error: code = NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist" Apr 22 17:55:18.260995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.260972 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff" Apr 22 17:55:18.261172 ip-10-0-131-69 kubenswrapper[2583]: E0422 17:55:18.261156 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff" Apr 22 17:55:18.261226 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261178 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} err="failed to get container status \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": rpc error: code = NotFound desc = 
could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist"
Apr 22 17:55:18.261226 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261197 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"
Apr 22 17:55:18.261440 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261404 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} err="failed to get container status \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist"
Apr 22 17:55:18.261482 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261441 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"
Apr 22 17:55:18.261672 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261655 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} err="failed to get container status \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist"
Apr 22 17:55:18.261720 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261674 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"
Apr 22 17:55:18.261885 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261868 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} err="failed to get container status \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist"
Apr 22 17:55:18.261943 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.261887 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"
Apr 22 17:55:18.262101 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262086 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} err="failed to get container status \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist"
Apr 22 17:55:18.262146 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262101 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"
Apr 22 17:55:18.262296 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262279 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} err="failed to get container status \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist"
Apr 22 17:55:18.262336 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262296 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"
Apr 22 17:55:18.262494 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262478 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} err="failed to get container status \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": rpc error: code = NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist"
Apr 22 17:55:18.262561 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262497 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"
Apr 22 17:55:18.262731 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262714 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} err="failed to get container status \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": rpc error: code = NotFound desc = could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist"
Apr 22 17:55:18.262782 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262732 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"
Apr 22 17:55:18.262928 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262911 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} err="failed to get container status \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist"
Apr 22 17:55:18.262969 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.262929 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"
Apr 22 17:55:18.263141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263124 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} err="failed to get container status \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist"
Apr 22 17:55:18.263141 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263140 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"
Apr 22 17:55:18.263306 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263293 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} err="failed to get container status \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist"
Apr 22 17:55:18.263350 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263306 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"
Apr 22 17:55:18.263493 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263477 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} err="failed to get container status \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist"
Apr 22 17:55:18.263531 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263495 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"
Apr 22 17:55:18.263739 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263718 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} err="failed to get container status \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist"
Apr 22 17:55:18.263812 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263741 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"
Apr 22 17:55:18.263956 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263937 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} err="failed to get container status \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": rpc error: code = NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist"
Apr 22 17:55:18.264013 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.263958 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"
Apr 22 17:55:18.264171 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264152 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} err="failed to get container status \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": rpc error: code = NotFound desc = could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist"
Apr 22 17:55:18.264250 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264172 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"
Apr 22 17:55:18.264350 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264334 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} err="failed to get container status \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist"
Apr 22 17:55:18.264391 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264353 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"
Apr 22 17:55:18.264541 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264524 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} err="failed to get container status \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist"
Apr 22 17:55:18.264604 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264544 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"
Apr 22 17:55:18.264745 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264729 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} err="failed to get container status \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist"
Apr 22 17:55:18.264745 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264744 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"
Apr 22 17:55:18.264959 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264942 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} err="failed to get container status \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist"
Apr 22 17:55:18.265060 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.264960 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"
Apr 22 17:55:18.265192 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.265166 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} err="failed to get container status \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist"
Apr 22 17:55:18.265192 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.265192 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"
Apr 22 17:55:18.265723 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.265690 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} err="failed to get container status \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": rpc error: code = NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist"
Apr 22 17:55:18.265723 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.265710 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"
Apr 22 17:55:18.266075 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266012 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} err="failed to get container status \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": rpc error: code = NotFound desc = could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist"
Apr 22 17:55:18.266075 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266036 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"
Apr 22 17:55:18.266300 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266278 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} err="failed to get container status \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist"
Apr 22 17:55:18.266388 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266312 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"
Apr 22 17:55:18.266572 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266554 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} err="failed to get container status \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist"
Apr 22 17:55:18.266701 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266575 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"
Apr 22 17:55:18.266855 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266834 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} err="failed to get container status \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist"
Apr 22 17:55:18.266903 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.266857 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"
Apr 22 17:55:18.267098 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267072 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} err="failed to get container status \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist"
Apr 22 17:55:18.267098 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267097 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"
Apr 22 17:55:18.267241 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267220 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:55:18.267371 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267348 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} err="failed to get container status \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist"
Apr 22 17:55:18.267371 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267369 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"
Apr 22 17:55:18.267618 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267597 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} err="failed to get container status \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": rpc error: code = NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist"
Apr 22 17:55:18.267691 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267617 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"
Apr 22 17:55:18.267691 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267618 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="prometheus"
Apr 22 17:55:18.267691 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267689 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="prometheus"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267707 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-web"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267713 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-web"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267728 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267735 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267746 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-thanos"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267754 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-thanos"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267764 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="thanos-sidecar"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267772 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="thanos-sidecar"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267783 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="init-config-reloader"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267791 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="init-config-reloader"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267802 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="config-reloader"
Apr 22 17:55:18.267836 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267808 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="config-reloader"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267886 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-thanos"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267882 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} err="failed to get container status \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": rpc error: code = NotFound desc = could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267910 2583 scope.go:117] "RemoveContainer" containerID="59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267895 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267961 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="kube-rbac-proxy-web"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267970 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="thanos-sidecar"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267978 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="prometheus"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.267985 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" containerName="config-reloader"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268116 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e"} err="failed to get container status \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": rpc error: code = NotFound desc = could not find container \"59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e\": container with ID starting with 59293f6e7e6d9964eae53fee1b76ae11c8bd762f79922f5438ad2c19fc55e72e not found: ID does not exist"
Apr 22 17:55:18.268362 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268136 2583 scope.go:117] "RemoveContainer" containerID="87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"
Apr 22 17:55:18.268688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268380 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba"} err="failed to get container status \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": rpc error: code = NotFound desc = could not find container \"87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba\": container with ID starting with 87ae991672cc091c9cb700947c8f5208fdf0cf2ca3f84c48d88add9adb3f30ba not found: ID does not exist"
Apr 22 17:55:18.268688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268402 2583 scope.go:117] "RemoveContainer" containerID="779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"
Apr 22 17:55:18.268688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268657 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be"} err="failed to get container status \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": rpc error: code = NotFound desc = could not find container \"779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be\": container with ID starting with 779dfca35738414d4da9e2a1b5d923a349ef2212b23d1ac5616c35bf7e2857be not found: ID does not exist"
Apr 22 17:55:18.268688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268674 2583 scope.go:117] "RemoveContainer" containerID="53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"
Apr 22 17:55:18.268913 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268892 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479"} err="failed to get container status \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": rpc error: code = NotFound desc = could not find container \"53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479\": container with ID starting with 53b0322c3974efe3d2a476a10f08eb1c5dce6c26acd22d38979f08522106d479 not found: ID does not exist"
Apr 22 17:55:18.268953 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.268915 2583 scope.go:117] "RemoveContainer" containerID="faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"
Apr 22 17:55:18.269140 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.269117 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448"} err="failed to get container status \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": rpc error: code = NotFound desc = could not find container \"faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448\": container with ID starting with faf972b90e29284175bd8f74410c06af97f34274c7369526f28a058bb0cdd448 not found: ID does not exist"
Apr 22 17:55:18.269184 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.269142 2583 scope.go:117] "RemoveContainer" containerID="a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"
Apr 22 17:55:18.269386 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.269368 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05"} err="failed to get container status \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": rpc error: code = NotFound desc = could not find container \"a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05\": container with ID starting with a402ccf86fcd21ee0d31bf65470e72bf70dc97240d51d7aafea60a2e1dadba05 not found: ID does not exist"
Apr 22 17:55:18.269441 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.269387 2583 scope.go:117] "RemoveContainer" containerID="95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"
Apr 22 17:55:18.269598 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.269580 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff"} err="failed to get container status \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": rpc error: code = NotFound desc = could not find container \"95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff\": container with ID starting with 95eeda473a3e84a202c480a3b94887e89ecc7a34c055890e08136c30430539ff not found: ID does not exist"
Apr 22 17:55:18.273167 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.273153 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.275997 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.275979 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 17:55:18.276479 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.276447 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 17:55:18.276647 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.276611 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 17:55:18.276941 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.276919 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-995e9epqp30o6\""
Apr 22 17:55:18.277157 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.277139 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 17:55:18.277483 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.277461 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 17:55:18.277594 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.277484 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-k658d\""
Apr 22 17:55:18.277594 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.277466 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 17:55:18.277594 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.277466 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 17:55:18.277594 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.277466 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 17:55:18.278990 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.278971 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 17:55:18.279070 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.279048 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 17:55:18.280365 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.280348 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 17:55:18.283459 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.283443 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 17:55:18.286284 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.286265 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:55:18.367797 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.367759 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b0115df0-282f-469a-afef-106d26ba3616-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.367956 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.367823 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368044 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.367989 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-web-config\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368044 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368028 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368148 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368077 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-config\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368185 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368164 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368218 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368203 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368263 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368247 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368317 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368300 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0115df0-282f-469a-afef-106d26ba3616-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368368 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368353 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmx8l\" (UniqueName: \"kubernetes.io/projected/b0115df0-282f-469a-afef-106d26ba3616-kube-api-access-cmx8l\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.368415 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368402 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368723 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368600 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368723 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0115df0-282f-469a-afef-106d26ba3616-config-out\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368888 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368769 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368888 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368808 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368888 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368875 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368990 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368900 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.368990 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.368927 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.469958 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.469925 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.469974 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0115df0-282f-469a-afef-106d26ba3616-config-out\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470113 
ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470019 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470055 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470113 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470094 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470118 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470144 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470169 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b0115df0-282f-469a-afef-106d26ba3616-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470191 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-web-config\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470243 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470265 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-config\") pod \"prometheus-k8s-0\" (UID: 
\"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470298 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470324 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470318 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470339 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470370 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0115df0-282f-469a-afef-106d26ba3616-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470408 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmx8l\" (UniqueName: 
\"kubernetes.io/projected/b0115df0-282f-469a-afef-106d26ba3616-kube-api-access-cmx8l\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470438 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470759 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470696 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.470985 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.470853 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.472825 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.471617 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b0115df0-282f-469a-afef-106d26ba3616-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.472825 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.472463 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.472825 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.472702 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.474233 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.473762 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-web-config\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.474233 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.474194 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.474491 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.474445 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.474599 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.474550 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b0115df0-282f-469a-afef-106d26ba3616-config-out\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.474995 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.474967 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b0115df0-282f-469a-afef-106d26ba3616-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.475450 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.475423 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.476142 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.475799 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b0115df0-282f-469a-afef-106d26ba3616-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.476142 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.476093 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-config\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.476329 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.476305 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.476472 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.476453 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.476603 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.476580 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.478365 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.478346 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b0115df0-282f-469a-afef-106d26ba3616-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.479688 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.479670 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmx8l\" (UniqueName: \"kubernetes.io/projected/b0115df0-282f-469a-afef-106d26ba3616-kube-api-access-cmx8l\") pod \"prometheus-k8s-0\" (UID: \"b0115df0-282f-469a-afef-106d26ba3616\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:55:18.583149 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.583064 2583 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:55:18.666167 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.666132 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ffb134-3e79-4f84-b41c-ede7954c7c80" path="/var/lib/kubelet/pods/25ffb134-3e79-4f84-b41c-ede7954c7c80/volumes"
Apr 22 17:55:18.989290 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:18.989265 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:55:18.993144 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:55:18.993110 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0115df0_282f_469a_afef_106d26ba3616.slice/crio-0c0a585f9cf27b3ccff9b71d6cd052f5ca65e89a54187e8798a07e71626b0e93 WatchSource:0}: Error finding container 0c0a585f9cf27b3ccff9b71d6cd052f5ca65e89a54187e8798a07e71626b0e93: Status 404 returned error can't find the container with id 0c0a585f9cf27b3ccff9b71d6cd052f5ca65e89a54187e8798a07e71626b0e93
Apr 22 17:55:19.208705 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.208673 2583 generic.go:358] "Generic (PLEG): container finished" podID="b0115df0-282f-469a-afef-106d26ba3616" containerID="0ca00bd7dafb71882c951fcfb3ed9d11eeae0dcaaa2f567f40b69d0b63702a40" exitCode=0
Apr 22 17:55:19.209057 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.208711 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerDied","Data":"0ca00bd7dafb71882c951fcfb3ed9d11eeae0dcaaa2f567f40b69d0b63702a40"}
Apr 22 17:55:19.209057 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.208741 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"0c0a585f9cf27b3ccff9b71d6cd052f5ca65e89a54187e8798a07e71626b0e93"}
Apr 22 17:55:19.210864 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.210835 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h" event={"ID":"c44d7ace-2367-4456-b866-2a706fd03e27","Type":"ContainerStarted","Data":"1649e864db6ab27f9f238baee485e27831e92f424380fc11bbd446ca6045e505"}
Apr 22 17:55:19.210997 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.210868 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h" event={"ID":"c44d7ace-2367-4456-b866-2a706fd03e27","Type":"ContainerStarted","Data":"74b938b0be34bd0684d9786b8c2687d5f3cdc759b531c4b4689f68c33d97b70c"}
Apr 22 17:55:19.210997 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.210884 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h" event={"ID":"c44d7ace-2367-4456-b866-2a706fd03e27","Type":"ContainerStarted","Data":"0348d31de003c321db63b2d1da08a545f5b386cec0bf177929dc2fefaa369e44"}
Apr 22 17:55:19.256022 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:19.255971 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7c4f7df48c-7jl9h" podStartSLOduration=1.693461017 podStartE2EDuration="3.255955718s" podCreationTimestamp="2026-04-22 17:55:16 +0000 UTC" firstStartedPulling="2026-04-22 17:55:17.351920825 +0000 UTC m=+99.229189972" lastFinishedPulling="2026-04-22 17:55:18.914415524 +0000 UTC m=+100.791684673" observedRunningTime="2026-04-22 17:55:19.253487831 +0000 UTC m=+101.130757000" watchObservedRunningTime="2026-04-22 17:55:19.255955718 +0000 UTC m=+101.133224886"
Apr 22 17:55:20.216892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.216857 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"9a97e9f93ce7b8c9d49e43c786dcc90bd4525086dbe0743a1545483a9884e47d"}
Apr 22 17:55:20.216892 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.216895 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"575e827f88d25e20a32e5405f06951098a267987e5f9c23bd55fa68b2c318f44"}
Apr 22 17:55:20.217279 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.216905 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"4a19a89405be0f323e61a05b02d5c9961c5ea4eb8c4d4a475ab6ccba28cff5cf"}
Apr 22 17:55:20.217279 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.216914 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"03824ca57287f790bf8d4daabf090a7979d35216460aafef08f500a51061bdb3"}
Apr 22 17:55:20.217279 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.216922 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"fa4085e2ef03e283a7bac6728fd72abb6507e63fc1b01506db350f0ad9582c96"}
Apr 22 17:55:20.217279 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.216931 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b0115df0-282f-469a-afef-106d26ba3616","Type":"ContainerStarted","Data":"19b3457a586145f74b354a0812661a9580e38c90b6f2167770896ee43222b7d2"}
Apr 22 17:55:20.251578 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:20.251523 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.251507276 podStartE2EDuration="2.251507276s" podCreationTimestamp="2026-04-22 17:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:20.249173309 +0000 UTC m=+102.126442510" watchObservedRunningTime="2026-04-22 17:55:20.251507276 +0000 UTC m=+102.128776445"
Apr 22 17:55:23.584110 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:55:23.584076 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:18.583966 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:56:18.583929 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:18.599394 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:56:18.599370 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:19.401348 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:56:19.401322 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:12.584420 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.584375 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rzw2s"]
Apr 22 17:57:12.587677 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.587656 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.590422 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.590407 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:57:12.595405 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.595385 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rzw2s"]
Apr 22 17:57:12.655453 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.655419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b464d48-2213-403b-824b-b5e2c32387b5-dbus\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.655453 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.655459 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b464d48-2213-403b-824b-b5e2c32387b5-kubelet-config\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.655683 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.655489 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b464d48-2213-403b-824b-b5e2c32387b5-original-pull-secret\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.756795 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.756761 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b464d48-2213-403b-824b-b5e2c32387b5-original-pull-secret\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.756943 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.756838 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b464d48-2213-403b-824b-b5e2c32387b5-dbus\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.756943 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.756863 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b464d48-2213-403b-824b-b5e2c32387b5-kubelet-config\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.756943 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.756926 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b464d48-2213-403b-824b-b5e2c32387b5-kubelet-config\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.757037 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.757013 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b464d48-2213-403b-824b-b5e2c32387b5-dbus\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.759030 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.759010 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b464d48-2213-403b-824b-b5e2c32387b5-original-pull-secret\") pod \"global-pull-secret-syncer-rzw2s\" (UID: \"8b464d48-2213-403b-824b-b5e2c32387b5\") " pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:12.898020 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:12.897943 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rzw2s"
Apr 22 17:57:13.016421 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:13.016386 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rzw2s"]
Apr 22 17:57:13.019257 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:57:13.019227 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b464d48_2213_403b_824b_b5e2c32387b5.slice/crio-296faaa1d28fc6d19d1a5fc841ec6be30a8dabc476afc909f594e1f51075c26d WatchSource:0}: Error finding container 296faaa1d28fc6d19d1a5fc841ec6be30a8dabc476afc909f594e1f51075c26d: Status 404 returned error can't find the container with id 296faaa1d28fc6d19d1a5fc841ec6be30a8dabc476afc909f594e1f51075c26d
Apr 22 17:57:13.537888 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:13.537845 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rzw2s" event={"ID":"8b464d48-2213-403b-824b-b5e2c32387b5","Type":"ContainerStarted","Data":"296faaa1d28fc6d19d1a5fc841ec6be30a8dabc476afc909f594e1f51075c26d"}
Apr 22 17:57:17.551217 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:17.551174 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rzw2s" event={"ID":"8b464d48-2213-403b-824b-b5e2c32387b5","Type":"ContainerStarted","Data":"e39eac7f89af26f22f1346282e82edf14f310143bac18eece35866d725a0c9cd"}
Apr 22 17:57:17.566599 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:57:17.566546 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rzw2s" podStartSLOduration=1.9527073499999998 podStartE2EDuration="5.566530975s" podCreationTimestamp="2026-04-22 17:57:12 +0000 UTC" firstStartedPulling="2026-04-22 17:57:13.020945081 +0000 UTC m=+214.898214227" lastFinishedPulling="2026-04-22 17:57:16.634768689 +0000 UTC m=+218.512037852" observedRunningTime="2026-04-22 17:57:17.566200983 +0000 UTC m=+219.443470163" watchObservedRunningTime="2026-04-22 17:57:17.566530975 +0000 UTC m=+219.443800156"
Apr 22 17:58:38.559164 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:58:38.559134 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log"
Apr 22 17:58:38.560245 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:58:38.560219 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log"
Apr 22 17:58:38.566050 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:58:38.566033 2583 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 17:59:48.126914 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.126881 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-sswwp"]
Apr 22 17:59:48.129701 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.129683 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.132254 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.132231 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-v7l5p\"" Apr 22 17:59:48.132355 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.132236 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 17:59:48.133420 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.133405 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 17:59:48.140055 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.140032 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-sswwp"] Apr 22 17:59:48.269786 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.269745 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14ecede4-b217-4370-a111-69fc9ff899f9-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-sswwp\" (UID: \"14ecede4-b217-4370-a111-69fc9ff899f9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.269963 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.269800 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qhq7\" (UniqueName: \"kubernetes.io/projected/14ecede4-b217-4370-a111-69fc9ff899f9-kube-api-access-5qhq7\") pod \"cert-manager-webhook-587ccfb98-sswwp\" (UID: \"14ecede4-b217-4370-a111-69fc9ff899f9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.371097 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.371045 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/14ecede4-b217-4370-a111-69fc9ff899f9-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-sswwp\" (UID: \"14ecede4-b217-4370-a111-69fc9ff899f9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.371097 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.371106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qhq7\" (UniqueName: \"kubernetes.io/projected/14ecede4-b217-4370-a111-69fc9ff899f9-kube-api-access-5qhq7\") pod \"cert-manager-webhook-587ccfb98-sswwp\" (UID: \"14ecede4-b217-4370-a111-69fc9ff899f9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.379221 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.379160 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14ecede4-b217-4370-a111-69fc9ff899f9-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-sswwp\" (UID: \"14ecede4-b217-4370-a111-69fc9ff899f9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.379372 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.379350 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qhq7\" (UniqueName: \"kubernetes.io/projected/14ecede4-b217-4370-a111-69fc9ff899f9-kube-api-access-5qhq7\") pod \"cert-manager-webhook-587ccfb98-sswwp\" (UID: \"14ecede4-b217-4370-a111-69fc9ff899f9\") " pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.446385 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.446353 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:48.563913 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.563890 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-sswwp"] Apr 22 17:59:48.566410 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:59:48.566377 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ecede4_b217_4370_a111_69fc9ff899f9.slice/crio-850ed9c163940017ca24445eb2f66b25299d960b758532d2429c2bab45c77b01 WatchSource:0}: Error finding container 850ed9c163940017ca24445eb2f66b25299d960b758532d2429c2bab45c77b01: Status 404 returned error can't find the container with id 850ed9c163940017ca24445eb2f66b25299d960b758532d2429c2bab45c77b01 Apr 22 17:59:48.568152 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.568139 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:59:48.976673 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:48.976642 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" event={"ID":"14ecede4-b217-4370-a111-69fc9ff899f9","Type":"ContainerStarted","Data":"850ed9c163940017ca24445eb2f66b25299d960b758532d2429c2bab45c77b01"} Apr 22 17:59:51.988843 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:51.988804 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" event={"ID":"14ecede4-b217-4370-a111-69fc9ff899f9","Type":"ContainerStarted","Data":"1b5569c4c5568ccc7b4e149123bf3ed2c1f26731dde50709b0f540000ce340e2"} Apr 22 17:59:51.989298 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:51.988858 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 17:59:52.004912 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:52.004865 
2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" podStartSLOduration=1.143596764 podStartE2EDuration="4.004851545s" podCreationTimestamp="2026-04-22 17:59:48 +0000 UTC" firstStartedPulling="2026-04-22 17:59:48.568262938 +0000 UTC m=+370.445532083" lastFinishedPulling="2026-04-22 17:59:51.429517716 +0000 UTC m=+373.306786864" observedRunningTime="2026-04-22 17:59:52.00356483 +0000 UTC m=+373.880834023" watchObservedRunningTime="2026-04-22 17:59:52.004851545 +0000 UTC m=+373.882120713" Apr 22 17:59:53.541580 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.541551 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-nd245"] Apr 22 17:59:53.544888 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.544870 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.548452 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.548428 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-bms2h\"" Apr 22 17:59:53.555613 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.555592 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-nd245"] Apr 22 17:59:53.617706 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.617669 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qz7q\" (UniqueName: \"kubernetes.io/projected/eb775073-e14b-4afc-9e11-5eb165fe35cc-kube-api-access-2qz7q\") pod \"cert-manager-cainjector-68b757865b-nd245\" (UID: \"eb775073-e14b-4afc-9e11-5eb165fe35cc\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.617876 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.617733 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb775073-e14b-4afc-9e11-5eb165fe35cc-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-nd245\" (UID: \"eb775073-e14b-4afc-9e11-5eb165fe35cc\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.718435 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.718386 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb775073-e14b-4afc-9e11-5eb165fe35cc-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-nd245\" (UID: \"eb775073-e14b-4afc-9e11-5eb165fe35cc\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.718651 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.718486 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qz7q\" (UniqueName: \"kubernetes.io/projected/eb775073-e14b-4afc-9e11-5eb165fe35cc-kube-api-access-2qz7q\") pod \"cert-manager-cainjector-68b757865b-nd245\" (UID: \"eb775073-e14b-4afc-9e11-5eb165fe35cc\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.727112 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.727085 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qz7q\" (UniqueName: \"kubernetes.io/projected/eb775073-e14b-4afc-9e11-5eb165fe35cc-kube-api-access-2qz7q\") pod \"cert-manager-cainjector-68b757865b-nd245\" (UID: \"eb775073-e14b-4afc-9e11-5eb165fe35cc\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.727338 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.727317 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb775073-e14b-4afc-9e11-5eb165fe35cc-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-nd245\" (UID: 
\"eb775073-e14b-4afc-9e11-5eb165fe35cc\") " pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.853478 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.853401 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" Apr 22 17:59:53.971060 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.971030 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-nd245"] Apr 22 17:59:53.973947 ip-10-0-131-69 kubenswrapper[2583]: W0422 17:59:53.973919 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb775073_e14b_4afc_9e11_5eb165fe35cc.slice/crio-3c34eddae0eb6bd1cc8b8ada6cc8c64a9e97033a706699f0ade6b854b8987649 WatchSource:0}: Error finding container 3c34eddae0eb6bd1cc8b8ada6cc8c64a9e97033a706699f0ade6b854b8987649: Status 404 returned error can't find the container with id 3c34eddae0eb6bd1cc8b8ada6cc8c64a9e97033a706699f0ade6b854b8987649 Apr 22 17:59:53.996242 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:53.996217 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" event={"ID":"eb775073-e14b-4afc-9e11-5eb165fe35cc","Type":"ContainerStarted","Data":"3c34eddae0eb6bd1cc8b8ada6cc8c64a9e97033a706699f0ade6b854b8987649"} Apr 22 17:59:55.000772 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:55.000738 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" event={"ID":"eb775073-e14b-4afc-9e11-5eb165fe35cc","Type":"ContainerStarted","Data":"86878c81d25f64ae26ddfbe0bbdea0628734bf1e4c2cac68a2ec0dafb3588078"} Apr 22 17:59:55.016789 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:55.016744 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-nd245" 
podStartSLOduration=2.016729019 podStartE2EDuration="2.016729019s" podCreationTimestamp="2026-04-22 17:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:59:55.016703588 +0000 UTC m=+376.893972770" watchObservedRunningTime="2026-04-22 17:59:55.016729019 +0000 UTC m=+376.893998189" Apr 22 17:59:57.993646 ip-10-0-131-69 kubenswrapper[2583]: I0422 17:59:57.993596 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-sswwp" Apr 22 18:00:02.004835 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.004800 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm"] Apr 22 18:00:02.008184 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.008165 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.010817 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.010794 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:00:02.011951 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.011917 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-hhxjl\"" Apr 22 18:00:02.012043 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.011949 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:00:02.019071 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.019049 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm"] Apr 22 18:00:02.195397 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.195363 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef528543-6fab-4648-bad4-f26f355c74c9-tmp\") pod \"openshift-lws-operator-bfc7f696d-jdvrm\" (UID: \"ef528543-6fab-4648-bad4-f26f355c74c9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.195397 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.195400 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6c5x\" (UniqueName: \"kubernetes.io/projected/ef528543-6fab-4648-bad4-f26f355c74c9-kube-api-access-m6c5x\") pod \"openshift-lws-operator-bfc7f696d-jdvrm\" (UID: \"ef528543-6fab-4648-bad4-f26f355c74c9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.296084 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.296014 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef528543-6fab-4648-bad4-f26f355c74c9-tmp\") pod \"openshift-lws-operator-bfc7f696d-jdvrm\" (UID: \"ef528543-6fab-4648-bad4-f26f355c74c9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.296084 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.296048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6c5x\" (UniqueName: \"kubernetes.io/projected/ef528543-6fab-4648-bad4-f26f355c74c9-kube-api-access-m6c5x\") pod \"openshift-lws-operator-bfc7f696d-jdvrm\" (UID: \"ef528543-6fab-4648-bad4-f26f355c74c9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.296394 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.296376 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef528543-6fab-4648-bad4-f26f355c74c9-tmp\") pod \"openshift-lws-operator-bfc7f696d-jdvrm\" (UID: 
\"ef528543-6fab-4648-bad4-f26f355c74c9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.304619 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.304588 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6c5x\" (UniqueName: \"kubernetes.io/projected/ef528543-6fab-4648-bad4-f26f355c74c9-kube-api-access-m6c5x\") pod \"openshift-lws-operator-bfc7f696d-jdvrm\" (UID: \"ef528543-6fab-4648-bad4-f26f355c74c9\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.317839 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.317820 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" Apr 22 18:00:02.434785 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:02.434760 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm"] Apr 22 18:00:02.436751 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:00:02.436718 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef528543_6fab_4648_bad4_f26f355c74c9.slice/crio-ead37087514ac96537ed5766adeb5d3b66fce88ce49aac58af507d56fd273b64 WatchSource:0}: Error finding container ead37087514ac96537ed5766adeb5d3b66fce88ce49aac58af507d56fd273b64: Status 404 returned error can't find the container with id ead37087514ac96537ed5766adeb5d3b66fce88ce49aac58af507d56fd273b64 Apr 22 18:00:03.025161 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:03.025115 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" event={"ID":"ef528543-6fab-4648-bad4-f26f355c74c9","Type":"ContainerStarted","Data":"ead37087514ac96537ed5766adeb5d3b66fce88ce49aac58af507d56fd273b64"} Apr 22 18:00:06.345517 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.345483 2583 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-6z5tg"] Apr 22 18:00:06.352301 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.352279 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.356088 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.356068 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dqd59\"" Apr 22 18:00:06.357961 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.357936 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-6z5tg"] Apr 22 18:00:06.433871 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.433845 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce-bound-sa-token\") pod \"cert-manager-79c8d999ff-6z5tg\" (UID: \"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce\") " pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.434022 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.433893 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcg2\" (UniqueName: \"kubernetes.io/projected/ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce-kube-api-access-nqcg2\") pod \"cert-manager-79c8d999ff-6z5tg\" (UID: \"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce\") " pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.535269 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.535242 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce-bound-sa-token\") pod \"cert-manager-79c8d999ff-6z5tg\" (UID: \"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce\") " pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 
18:00:06.535419 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.535290 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcg2\" (UniqueName: \"kubernetes.io/projected/ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce-kube-api-access-nqcg2\") pod \"cert-manager-79c8d999ff-6z5tg\" (UID: \"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce\") " pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.544308 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.544278 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcg2\" (UniqueName: \"kubernetes.io/projected/ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce-kube-api-access-nqcg2\") pod \"cert-manager-79c8d999ff-6z5tg\" (UID: \"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce\") " pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.544416 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.544342 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce-bound-sa-token\") pod \"cert-manager-79c8d999ff-6z5tg\" (UID: \"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce\") " pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.662297 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.662227 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-6z5tg" Apr 22 18:00:06.780767 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:06.780740 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-6z5tg"] Apr 22 18:00:06.783413 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:00:06.783383 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae8ab904_e66f_4fab_86b9_c6c6aa9f3fce.slice/crio-a1b9f1b232408fb26978e20c4fe736bea988242b4454775389b8f7772e6d06a6 WatchSource:0}: Error finding container a1b9f1b232408fb26978e20c4fe736bea988242b4454775389b8f7772e6d06a6: Status 404 returned error can't find the container with id a1b9f1b232408fb26978e20c4fe736bea988242b4454775389b8f7772e6d06a6 Apr 22 18:00:07.037237 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:07.037203 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-6z5tg" event={"ID":"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce","Type":"ContainerStarted","Data":"1643aa7fc5736135e91e665fbd9a1950ead90781f47e13e62223244c175cfd55"} Apr 22 18:00:07.037237 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:07.037240 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-6z5tg" event={"ID":"ae8ab904-e66f-4fab-86b9-c6c6aa9f3fce","Type":"ContainerStarted","Data":"a1b9f1b232408fb26978e20c4fe736bea988242b4454775389b8f7772e6d06a6"} Apr 22 18:00:07.053507 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:00:07.053453 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-6z5tg" podStartSLOduration=1.053434756 podStartE2EDuration="1.053434756s" podCreationTimestamp="2026-04-22 18:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:00:07.053410458 +0000 UTC m=+388.930679637" 
watchObservedRunningTime="2026-04-22 18:00:07.053434756 +0000 UTC m=+388.930703926" Apr 22 18:01:03.219607 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:01:03.219553 2583 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309" Apr 22 18:01:03.220038 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:01:03.219758 2583 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:openshift-lws-operator,Image:registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309,Command:[lws-operator],Args:[operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAME,Value:openshift-lws-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPERAND_IMAGE,Value:registry.redhat.io/leader-worker-set/lws-rhel9@sha256:affb303b1173c273231bb50fef07310b0e220d2f08bfc0aa5912d0825e3e0d4f,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:leader-worker-set.v1.0.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6c5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-lws-operator-bfc7f696d-jdvrm_openshift-lws-operator(ef528543-6fab-4648-bad4-f26f355c74c9): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 22 18:01:03.220932 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:01:03.220904 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-lws-operator\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" podUID="ef528543-6fab-4648-bad4-f26f355c74c9" Apr 22 18:01:04.210717 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:01:04.210676 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-lws-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received 
unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" podUID="ef528543-6fab-4648-bad4-f26f355c74c9" Apr 22 18:01:21.263635 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:21.263595 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" event={"ID":"ef528543-6fab-4648-bad4-f26f355c74c9","Type":"ContainerStarted","Data":"446826f91fc7dad1b0232ad3efd751016b3f5db591f63ff12e11d35ff7854e53"} Apr 22 18:01:21.281229 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:21.281176 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-jdvrm" podStartSLOduration=2.015877702 podStartE2EDuration="1m20.281160067s" podCreationTimestamp="2026-04-22 18:00:01 +0000 UTC" firstStartedPulling="2026-04-22 18:00:02.438224318 +0000 UTC m=+384.315493465" lastFinishedPulling="2026-04-22 18:01:20.703506684 +0000 UTC m=+462.580775830" observedRunningTime="2026-04-22 18:01:21.279393215 +0000 UTC m=+463.156662385" watchObservedRunningTime="2026-04-22 18:01:21.281160067 +0000 UTC m=+463.158429234" Apr 22 18:01:32.608451 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.608414 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89"] Apr 22 18:01:32.611878 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.611854 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.614396 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.614363 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:01:32.614535 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.614425 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:01:32.615760 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.615742 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-wjvx9\"" Apr 22 18:01:32.615830 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.615748 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:01:32.623662 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.623615 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89"] Apr 22 18:01:32.781245 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.781207 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-cert\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.781245 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.781246 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvdx\" (UniqueName: \"kubernetes.io/projected/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-kube-api-access-dfvdx\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: 
\"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.781487 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.781278 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-manager-config\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.781487 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.781402 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-metrics-cert\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.882299 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.882202 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-metrics-cert\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.882299 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.882279 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-cert\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.882299 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.882297 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dfvdx\" (UniqueName: \"kubernetes.io/projected/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-kube-api-access-dfvdx\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.882524 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.882317 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-manager-config\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.882967 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.882945 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-manager-config\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.884734 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.884714 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-metrics-cert\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.884826 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.884753 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-cert\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " 
pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.900068 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.900033 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvdx\" (UniqueName: \"kubernetes.io/projected/1600c6f0-3eed-49a9-a9d5-12b5e03ed346-kube-api-access-dfvdx\") pod \"lws-controller-manager-5b9bbc5c4d-d2t89\" (UID: \"1600c6f0-3eed-49a9-a9d5-12b5e03ed346\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:32.921882 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:32.921846 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:33.072636 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:33.072493 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89"] Apr 22 18:01:33.075187 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:01:33.075158 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1600c6f0_3eed_49a9_a9d5_12b5e03ed346.slice/crio-82ab43f3564a2472a84d724088de9ee396fca8833be4c69d99a5ebf473374939 WatchSource:0}: Error finding container 82ab43f3564a2472a84d724088de9ee396fca8833be4c69d99a5ebf473374939: Status 404 returned error can't find the container with id 82ab43f3564a2472a84d724088de9ee396fca8833be4c69d99a5ebf473374939 Apr 22 18:01:33.302697 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:33.302656 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" event={"ID":"1600c6f0-3eed-49a9-a9d5-12b5e03ed346","Type":"ContainerStarted","Data":"82ab43f3564a2472a84d724088de9ee396fca8833be4c69d99a5ebf473374939"} Apr 22 18:01:35.311580 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:35.311546 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" event={"ID":"1600c6f0-3eed-49a9-a9d5-12b5e03ed346","Type":"ContainerStarted","Data":"837902ac6140c1f7cb1642a939edd7df462cf1e259bd51641eafcdfd039c2318"} Apr 22 18:01:35.311975 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:35.311746 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:01:35.328858 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:35.328806 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" podStartSLOduration=1.7595933910000001 podStartE2EDuration="3.328790944s" podCreationTimestamp="2026-04-22 18:01:32 +0000 UTC" firstStartedPulling="2026-04-22 18:01:33.076876616 +0000 UTC m=+474.954145761" lastFinishedPulling="2026-04-22 18:01:34.646074167 +0000 UTC m=+476.523343314" observedRunningTime="2026-04-22 18:01:35.327715278 +0000 UTC m=+477.204984460" watchObservedRunningTime="2026-04-22 18:01:35.328790944 +0000 UTC m=+477.206060112" Apr 22 18:01:46.316816 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:01:46.316783 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-d2t89" Apr 22 18:02:29.054473 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.054391 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4nqhm"] Apr 22 18:02:29.056693 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.056676 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:02:29.059612 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.059586 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:02:29.059612 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.059590 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:02:29.059808 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.059784 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5c4cw\"" Apr 22 18:02:29.072375 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.072349 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4nqhm"] Apr 22 18:02:29.078609 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.078585 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lx86\" (UniqueName: \"kubernetes.io/projected/c122c782-f623-45a9-b000-e3436d9bc99f-kube-api-access-6lx86\") pod \"authorino-operator-7587b89b76-4nqhm\" (UID: \"c122c782-f623-45a9-b000-e3436d9bc99f\") " pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:02:29.179711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.179678 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lx86\" (UniqueName: \"kubernetes.io/projected/c122c782-f623-45a9-b000-e3436d9bc99f-kube-api-access-6lx86\") pod \"authorino-operator-7587b89b76-4nqhm\" (UID: \"c122c782-f623-45a9-b000-e3436d9bc99f\") " pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:02:29.190715 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.190685 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lx86\" 
(UniqueName: \"kubernetes.io/projected/c122c782-f623-45a9-b000-e3436d9bc99f-kube-api-access-6lx86\") pod \"authorino-operator-7587b89b76-4nqhm\" (UID: \"c122c782-f623-45a9-b000-e3436d9bc99f\") " pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:02:29.366847 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.366754 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:02:29.501325 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:29.501296 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-4nqhm"] Apr 22 18:02:29.504637 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:02:29.504595 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc122c782_f623_45a9_b000_e3436d9bc99f.slice/crio-1bd674201fe36dc5473ebbdde307da22aaaf5f6b76f2509f5156ba7c5ec1d751 WatchSource:0}: Error finding container 1bd674201fe36dc5473ebbdde307da22aaaf5f6b76f2509f5156ba7c5ec1d751: Status 404 returned error can't find the container with id 1bd674201fe36dc5473ebbdde307da22aaaf5f6b76f2509f5156ba7c5ec1d751 Apr 22 18:02:30.488885 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:30.488850 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" event={"ID":"c122c782-f623-45a9-b000-e3436d9bc99f","Type":"ContainerStarted","Data":"1bd674201fe36dc5473ebbdde307da22aaaf5f6b76f2509f5156ba7c5ec1d751"} Apr 22 18:02:32.498898 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:32.498813 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" event={"ID":"c122c782-f623-45a9-b000-e3436d9bc99f","Type":"ContainerStarted","Data":"7a06688337053be8545fbe47f63edb119660d718a7dcb63fe5a92b8048461b9e"} Apr 22 18:02:32.499238 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:32.498961 2583 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:02:32.519670 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:32.519599 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" podStartSLOduration=0.840894498 podStartE2EDuration="3.519585459s" podCreationTimestamp="2026-04-22 18:02:29 +0000 UTC" firstStartedPulling="2026-04-22 18:02:29.507119236 +0000 UTC m=+531.384388381" lastFinishedPulling="2026-04-22 18:02:32.185810192 +0000 UTC m=+534.063079342" observedRunningTime="2026-04-22 18:02:32.517410323 +0000 UTC m=+534.394679492" watchObservedRunningTime="2026-04-22 18:02:32.519585459 +0000 UTC m=+534.396854628" Apr 22 18:02:43.504520 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:02:43.504489 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-4nqhm" Apr 22 18:03:14.632219 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.632188 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-bmwd4"] Apr 22 18:03:14.636864 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.636839 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:14.640379 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.640354 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-bljh8\"" Apr 22 18:03:14.646103 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.646076 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-bmwd4"] Apr 22 18:03:14.675012 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.674979 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc4hk\" (UniqueName: \"kubernetes.io/projected/f0324cc0-7bdf-48e3-91da-d2c630c886fe-kube-api-access-cc4hk\") pod \"authorino-674b59b84c-bmwd4\" (UID: \"f0324cc0-7bdf-48e3-91da-d2c630c886fe\") " pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:14.775655 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.775585 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc4hk\" (UniqueName: \"kubernetes.io/projected/f0324cc0-7bdf-48e3-91da-d2c630c886fe-kube-api-access-cc4hk\") pod \"authorino-674b59b84c-bmwd4\" (UID: \"f0324cc0-7bdf-48e3-91da-d2c630c886fe\") " pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:14.785297 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.785272 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc4hk\" (UniqueName: \"kubernetes.io/projected/f0324cc0-7bdf-48e3-91da-d2c630c886fe-kube-api-access-cc4hk\") pod \"authorino-674b59b84c-bmwd4\" (UID: \"f0324cc0-7bdf-48e3-91da-d2c630c886fe\") " pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:14.948583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:14.948548 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:15.071748 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:15.071721 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-bmwd4"] Apr 22 18:03:15.074337 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:03:15.074307 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0324cc0_7bdf_48e3_91da_d2c630c886fe.slice/crio-3a0b1f3c8dc478be7bfe86797514a4ae7299b770476da51ab3f56af5f7f13752 WatchSource:0}: Error finding container 3a0b1f3c8dc478be7bfe86797514a4ae7299b770476da51ab3f56af5f7f13752: Status 404 returned error can't find the container with id 3a0b1f3c8dc478be7bfe86797514a4ae7299b770476da51ab3f56af5f7f13752 Apr 22 18:03:15.647496 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:15.647462 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-bmwd4" event={"ID":"f0324cc0-7bdf-48e3-91da-d2c630c886fe","Type":"ContainerStarted","Data":"3a0b1f3c8dc478be7bfe86797514a4ae7299b770476da51ab3f56af5f7f13752"} Apr 22 18:03:18.660269 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:18.660228 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-bmwd4" event={"ID":"f0324cc0-7bdf-48e3-91da-d2c630c886fe","Type":"ContainerStarted","Data":"80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82"} Apr 22 18:03:18.675550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:18.675496 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-bmwd4" podStartSLOduration=2.014092851 podStartE2EDuration="4.675480726s" podCreationTimestamp="2026-04-22 18:03:14 +0000 UTC" firstStartedPulling="2026-04-22 18:03:15.075505922 +0000 UTC m=+576.952775069" lastFinishedPulling="2026-04-22 18:03:17.736893784 +0000 UTC m=+579.614162944" 
observedRunningTime="2026-04-22 18:03:18.674540303 +0000 UTC m=+580.551809472" watchObservedRunningTime="2026-04-22 18:03:18.675480726 +0000 UTC m=+580.552749895" Apr 22 18:03:19.216639 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:19.216581 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-bmwd4"] Apr 22 18:03:20.666159 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:20.666123 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-bmwd4" podUID="f0324cc0-7bdf-48e3-91da-d2c630c886fe" containerName="authorino" containerID="cri-o://80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82" gracePeriod=30 Apr 22 18:03:20.911032 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:20.910999 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:20.931404 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:20.931373 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc4hk\" (UniqueName: \"kubernetes.io/projected/f0324cc0-7bdf-48e3-91da-d2c630c886fe-kube-api-access-cc4hk\") pod \"f0324cc0-7bdf-48e3-91da-d2c630c886fe\" (UID: \"f0324cc0-7bdf-48e3-91da-d2c630c886fe\") " Apr 22 18:03:20.933990 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:20.933959 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0324cc0-7bdf-48e3-91da-d2c630c886fe-kube-api-access-cc4hk" (OuterVolumeSpecName: "kube-api-access-cc4hk") pod "f0324cc0-7bdf-48e3-91da-d2c630c886fe" (UID: "f0324cc0-7bdf-48e3-91da-d2c630c886fe"). InnerVolumeSpecName "kube-api-access-cc4hk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:03:21.031999 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.031966 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cc4hk\" (UniqueName: \"kubernetes.io/projected/f0324cc0-7bdf-48e3-91da-d2c630c886fe-kube-api-access-cc4hk\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:03:21.670731 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.670683 2583 generic.go:358] "Generic (PLEG): container finished" podID="f0324cc0-7bdf-48e3-91da-d2c630c886fe" containerID="80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82" exitCode=0 Apr 22 18:03:21.670731 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.670732 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-bmwd4" Apr 22 18:03:21.671229 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.670766 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-bmwd4" event={"ID":"f0324cc0-7bdf-48e3-91da-d2c630c886fe","Type":"ContainerDied","Data":"80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82"} Apr 22 18:03:21.671229 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.670801 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-bmwd4" event={"ID":"f0324cc0-7bdf-48e3-91da-d2c630c886fe","Type":"ContainerDied","Data":"3a0b1f3c8dc478be7bfe86797514a4ae7299b770476da51ab3f56af5f7f13752"} Apr 22 18:03:21.671229 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.670818 2583 scope.go:117] "RemoveContainer" containerID="80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82" Apr 22 18:03:21.679165 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.679146 2583 scope.go:117] "RemoveContainer" containerID="80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82" Apr 22 18:03:21.679458 ip-10-0-131-69 kubenswrapper[2583]: E0422 
18:03:21.679435 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82\": container with ID starting with 80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82 not found: ID does not exist" containerID="80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82" Apr 22 18:03:21.679547 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.679465 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82"} err="failed to get container status \"80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82\": rpc error: code = NotFound desc = could not find container \"80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82\": container with ID starting with 80ed3dcaa7f7a853adee51937d7866d39fcdaa56c588ed1190002552f923fd82 not found: ID does not exist" Apr 22 18:03:21.691311 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.691273 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-bmwd4"] Apr 22 18:03:21.698494 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:21.698463 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-bmwd4"] Apr 22 18:03:22.665889 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:22.665853 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0324cc0-7bdf-48e3-91da-d2c630c886fe" path="/var/lib/kubelet/pods/f0324cc0-7bdf-48e3-91da-d2c630c886fe/volumes" Apr 22 18:03:38.584619 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:38.584590 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:03:38.585037 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:38.584839 
2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:03:57.511511 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.511440 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-nxhn6"] Apr 22 18:03:57.511893 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.511789 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0324cc0-7bdf-48e3-91da-d2c630c886fe" containerName="authorino" Apr 22 18:03:57.511893 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.511800 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0324cc0-7bdf-48e3-91da-d2c630c886fe" containerName="authorino" Apr 22 18:03:57.511893 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.511868 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0324cc0-7bdf-48e3-91da-d2c630c886fe" containerName="authorino" Apr 22 18:03:57.513846 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.513827 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.516368 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.516349 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:03:57.517567 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.517547 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-rxwg2\"" Apr 22 18:03:57.517567 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.517558 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:03:57.517715 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.517557 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 18:03:57.524360 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.524343 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-nxhn6"] Apr 22 18:03:57.652513 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.652476 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ca187ba-4e2b-4d67-b139-8d1502d85145-cert\") pod \"kserve-controller-manager-644fd69db4-nxhn6\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.652722 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.652557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2lr\" (UniqueName: \"kubernetes.io/projected/3ca187ba-4e2b-4d67-b139-8d1502d85145-kube-api-access-hz2lr\") pod \"kserve-controller-manager-644fd69db4-nxhn6\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 
18:03:57.753352 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.753319 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ca187ba-4e2b-4d67-b139-8d1502d85145-cert\") pod \"kserve-controller-manager-644fd69db4-nxhn6\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.753558 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.753381 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2lr\" (UniqueName: \"kubernetes.io/projected/3ca187ba-4e2b-4d67-b139-8d1502d85145-kube-api-access-hz2lr\") pod \"kserve-controller-manager-644fd69db4-nxhn6\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.755833 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.755808 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ca187ba-4e2b-4d67-b139-8d1502d85145-cert\") pod \"kserve-controller-manager-644fd69db4-nxhn6\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.762827 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.762763 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2lr\" (UniqueName: \"kubernetes.io/projected/3ca187ba-4e2b-4d67-b139-8d1502d85145-kube-api-access-hz2lr\") pod \"kserve-controller-manager-644fd69db4-nxhn6\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.824554 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.824520 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:03:57.944178 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:57.944149 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-nxhn6"] Apr 22 18:03:57.946721 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:03:57.946695 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ca187ba_4e2b_4d67_b139_8d1502d85145.slice/crio-9f0a108458153f11e1cfff41558df707bfe6cdbfe33af67d478a1ae826e87713 WatchSource:0}: Error finding container 9f0a108458153f11e1cfff41558df707bfe6cdbfe33af67d478a1ae826e87713: Status 404 returned error can't find the container with id 9f0a108458153f11e1cfff41558df707bfe6cdbfe33af67d478a1ae826e87713 Apr 22 18:03:58.339142 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:03:58.339101 2583 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc: reading manifest sha256:9d2c743aaba1130d3dce37478e03e67cabfbdc22c2154f214ea0d68c91f586ed in quay.io/opendatahub/kserve-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" image="quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc" Apr 22 18:03:58.339410 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:03:58.339352 2583 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc,Command:[/manager],Args:[--metrics-addr=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SECRET_NAME,Value:kserve-webhook-server-cert,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{209715200 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hz2lr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kserve-controller-manager-644fd69db4-nxhn6_kserve(3ca187ba-4e2b-4d67-b139-8d1502d85145): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc: reading manifest sha256:9d2c743aaba1130d3dce37478e03e67cabfbdc22c2154f214ea0d68c91f586ed in quay.io/opendatahub/kserve-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 22 18:03:58.340547 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:03:58.340518 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc: reading manifest sha256:9d2c743aaba1130d3dce37478e03e67cabfbdc22c2154f214ea0d68c91f586ed in 
quay.io/opendatahub/kserve-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" Apr 22 18:03:58.802270 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:03:58.802233 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" event={"ID":"3ca187ba-4e2b-4d67-b139-8d1502d85145","Type":"ContainerStarted","Data":"9f0a108458153f11e1cfff41558df707bfe6cdbfe33af67d478a1ae826e87713"} Apr 22 18:03:58.802965 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:03:58.802934 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc: reading manifest sha256:9d2c743aaba1130d3dce37478e03e67cabfbdc22c2154f214ea0d68c91f586ed in quay.io/opendatahub/kserve-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" Apr 22 18:03:59.805464 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:03:59.805419 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from 
manifest list: determining manifest MIME type for docker://quay.io/opendatahub/kserve-controller@sha256:d7079381b5e1d30dae300fcc7b5e362eca60af047733ae33c70d1058f77222cc: reading manifest sha256:9d2c743aaba1130d3dce37478e03e67cabfbdc22c2154f214ea0d68c91f586ed in quay.io/opendatahub/kserve-controller: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" Apr 22 18:04:17.867973 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:17.867939 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" event={"ID":"3ca187ba-4e2b-4d67-b139-8d1502d85145","Type":"ContainerStarted","Data":"1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289"} Apr 22 18:04:17.868366 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:17.868143 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:04:17.888011 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:17.887964 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" podStartSLOduration=1.701403056 podStartE2EDuration="20.887949222s" podCreationTimestamp="2026-04-22 18:03:57 +0000 UTC" firstStartedPulling="2026-04-22 18:03:57.947878685 +0000 UTC m=+619.825147830" lastFinishedPulling="2026-04-22 18:04:17.134424846 +0000 UTC m=+639.011693996" observedRunningTime="2026-04-22 18:04:17.88685479 +0000 UTC m=+639.764123960" watchObservedRunningTime="2026-04-22 18:04:17.887949222 +0000 UTC m=+639.765218389" Apr 22 18:04:48.211472 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.211434 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-nxhn6"] Apr 22 18:04:48.212032 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.211701 2583 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" containerName="manager" containerID="cri-o://1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289" gracePeriod=10 Apr 22 18:04:48.215366 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.215338 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:04:48.236939 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.236910 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-l8fxs"] Apr 22 18:04:48.240818 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.240799 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.248018 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.247980 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-l8fxs"] Apr 22 18:04:48.306002 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.305970 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcfb\" (UniqueName: \"kubernetes.io/projected/f101634d-e39a-4227-afbb-ab979ac80dfd-kube-api-access-frcfb\") pod \"kserve-controller-manager-644fd69db4-l8fxs\" (UID: \"f101634d-e39a-4227-afbb-ab979ac80dfd\") " pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.306157 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.306023 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f101634d-e39a-4227-afbb-ab979ac80dfd-cert\") pod \"kserve-controller-manager-644fd69db4-l8fxs\" (UID: \"f101634d-e39a-4227-afbb-ab979ac80dfd\") " pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" 
Apr 22 18:04:48.407289 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.407255 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f101634d-e39a-4227-afbb-ab979ac80dfd-cert\") pod \"kserve-controller-manager-644fd69db4-l8fxs\" (UID: \"f101634d-e39a-4227-afbb-ab979ac80dfd\") " pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.407448 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.407332 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frcfb\" (UniqueName: \"kubernetes.io/projected/f101634d-e39a-4227-afbb-ab979ac80dfd-kube-api-access-frcfb\") pod \"kserve-controller-manager-644fd69db4-l8fxs\" (UID: \"f101634d-e39a-4227-afbb-ab979ac80dfd\") " pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.409727 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.409701 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f101634d-e39a-4227-afbb-ab979ac80dfd-cert\") pod \"kserve-controller-manager-644fd69db4-l8fxs\" (UID: \"f101634d-e39a-4227-afbb-ab979ac80dfd\") " pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.415932 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.415904 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcfb\" (UniqueName: \"kubernetes.io/projected/f101634d-e39a-4227-afbb-ab979ac80dfd-kube-api-access-frcfb\") pod \"kserve-controller-manager-644fd69db4-l8fxs\" (UID: \"f101634d-e39a-4227-afbb-ab979ac80dfd\") " pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.448635 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.448601 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:04:48.508554 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.508460 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz2lr\" (UniqueName: \"kubernetes.io/projected/3ca187ba-4e2b-4d67-b139-8d1502d85145-kube-api-access-hz2lr\") pod \"3ca187ba-4e2b-4d67-b139-8d1502d85145\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " Apr 22 18:04:48.508731 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.508563 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ca187ba-4e2b-4d67-b139-8d1502d85145-cert\") pod \"3ca187ba-4e2b-4d67-b139-8d1502d85145\" (UID: \"3ca187ba-4e2b-4d67-b139-8d1502d85145\") " Apr 22 18:04:48.510694 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.510660 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca187ba-4e2b-4d67-b139-8d1502d85145-kube-api-access-hz2lr" (OuterVolumeSpecName: "kube-api-access-hz2lr") pod "3ca187ba-4e2b-4d67-b139-8d1502d85145" (UID: "3ca187ba-4e2b-4d67-b139-8d1502d85145"). InnerVolumeSpecName "kube-api-access-hz2lr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:48.510796 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.510712 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca187ba-4e2b-4d67-b139-8d1502d85145-cert" (OuterVolumeSpecName: "cert") pod "3ca187ba-4e2b-4d67-b139-8d1502d85145" (UID: "3ca187ba-4e2b-4d67-b139-8d1502d85145"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:48.587887 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.587852 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:48.609316 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.609287 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hz2lr\" (UniqueName: \"kubernetes.io/projected/3ca187ba-4e2b-4d67-b139-8d1502d85145-kube-api-access-hz2lr\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:04:48.609316 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.609314 2583 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ca187ba-4e2b-4d67-b139-8d1502d85145-cert\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:04:48.712882 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.712837 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-l8fxs"] Apr 22 18:04:48.715206 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:04:48.715178 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf101634d_e39a_4227_afbb_ab979ac80dfd.slice/crio-20d2932eb6e513ff62647664fd27d9acbe76785dd536ae801d7fb89e5a4980da WatchSource:0}: Error finding container 20d2932eb6e513ff62647664fd27d9acbe76785dd536ae801d7fb89e5a4980da: Status 404 returned error can't find the container with id 20d2932eb6e513ff62647664fd27d9acbe76785dd536ae801d7fb89e5a4980da Apr 22 18:04:48.716503 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.716486 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:04:48.989567 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.989525 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" event={"ID":"f101634d-e39a-4227-afbb-ab979ac80dfd","Type":"ContainerStarted","Data":"20d2932eb6e513ff62647664fd27d9acbe76785dd536ae801d7fb89e5a4980da"} Apr 22 18:04:48.990506 
ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.990480 2583 generic.go:358] "Generic (PLEG): container finished" podID="3ca187ba-4e2b-4d67-b139-8d1502d85145" containerID="1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289" exitCode=0 Apr 22 18:04:48.990655 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.990547 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" event={"ID":"3ca187ba-4e2b-4d67-b139-8d1502d85145","Type":"ContainerDied","Data":"1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289"} Apr 22 18:04:48.990655 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.990556 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" Apr 22 18:04:48.990655 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.990579 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-nxhn6" event={"ID":"3ca187ba-4e2b-4d67-b139-8d1502d85145","Type":"ContainerDied","Data":"9f0a108458153f11e1cfff41558df707bfe6cdbfe33af67d478a1ae826e87713"} Apr 22 18:04:48.990655 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.990600 2583 scope.go:117] "RemoveContainer" containerID="1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289" Apr 22 18:04:48.998348 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.998330 2583 scope.go:117] "RemoveContainer" containerID="1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289" Apr 22 18:04:48.998600 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:04:48.998580 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289\": container with ID starting with 1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289 not found: ID does not exist" 
containerID="1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289" Apr 22 18:04:48.998725 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:48.998614 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289"} err="failed to get container status \"1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289\": rpc error: code = NotFound desc = could not find container \"1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289\": container with ID starting with 1756144ae9657a3094511c3223c5c69e3f122e34f9da8d1d7c98aa4331295289 not found: ID does not exist" Apr 22 18:04:49.006953 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:49.006927 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-nxhn6"] Apr 22 18:04:49.010170 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:49.010144 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-nxhn6"] Apr 22 18:04:49.995586 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:49.995545 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" event={"ID":"f101634d-e39a-4227-afbb-ab979ac80dfd","Type":"ContainerStarted","Data":"4d5f8e18d6b7bc5390ab4c013d9db217d614f28772b12edf14231bdf3de15723"} Apr 22 18:04:49.996079 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:49.995598 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:04:50.015180 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:50.015131 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" podStartSLOduration=1.649721233 podStartE2EDuration="2.015116154s" podCreationTimestamp="2026-04-22 18:04:48 +0000 UTC" firstStartedPulling="2026-04-22 
18:04:48.716602548 +0000 UTC m=+670.593871694" lastFinishedPulling="2026-04-22 18:04:49.081997465 +0000 UTC m=+670.959266615" observedRunningTime="2026-04-22 18:04:50.014164416 +0000 UTC m=+671.891433607" watchObservedRunningTime="2026-04-22 18:04:50.015116154 +0000 UTC m=+671.892385392" Apr 22 18:04:50.667209 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:04:50.667174 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" path="/var/lib/kubelet/pods/3ca187ba-4e2b-4d67-b139-8d1502d85145/volumes" Apr 22 18:05:21.004580 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.004551 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-644fd69db4-l8fxs" Apr 22 18:05:21.866240 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.866202 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-xqmjx"] Apr 22 18:05:21.866673 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.866655 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" containerName="manager" Apr 22 18:05:21.866673 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.866672 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" containerName="manager" Apr 22 18:05:21.866812 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.866744 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ca187ba-4e2b-4d67-b139-8d1502d85145" containerName="manager" Apr 22 18:05:21.868778 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.868763 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:21.871230 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.871206 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:05:21.871822 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.871804 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-m7rcr\"" Apr 22 18:05:21.879067 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.879047 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-xqmjx"] Apr 22 18:05:21.913135 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.913111 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf4cae0f-528e-41b2-9e19-292633c04a9c-cert\") pod \"odh-model-controller-696fc77849-xqmjx\" (UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:21.913273 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:21.913157 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpxk\" (UniqueName: \"kubernetes.io/projected/bf4cae0f-528e-41b2-9e19-292633c04a9c-kube-api-access-dfpxk\") pod \"odh-model-controller-696fc77849-xqmjx\" (UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.014169 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:22.014140 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf4cae0f-528e-41b2-9e19-292633c04a9c-cert\") pod \"odh-model-controller-696fc77849-xqmjx\" (UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.014579 ip-10-0-131-69 
kubenswrapper[2583]: I0422 18:05:22.014204 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpxk\" (UniqueName: \"kubernetes.io/projected/bf4cae0f-528e-41b2-9e19-292633c04a9c-kube-api-access-dfpxk\") pod \"odh-model-controller-696fc77849-xqmjx\" (UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.014579 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:05:22.014276 2583 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:05:22.014579 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:05:22.014345 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf4cae0f-528e-41b2-9e19-292633c04a9c-cert podName:bf4cae0f-528e-41b2-9e19-292633c04a9c nodeName:}" failed. No retries permitted until 2026-04-22 18:05:22.514328771 +0000 UTC m=+704.391597917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf4cae0f-528e-41b2-9e19-292633c04a9c-cert") pod "odh-model-controller-696fc77849-xqmjx" (UID: "bf4cae0f-528e-41b2-9e19-292633c04a9c") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:05:22.022997 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:22.022971 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpxk\" (UniqueName: \"kubernetes.io/projected/bf4cae0f-528e-41b2-9e19-292633c04a9c-kube-api-access-dfpxk\") pod \"odh-model-controller-696fc77849-xqmjx\" (UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.517768 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:22.517739 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf4cae0f-528e-41b2-9e19-292633c04a9c-cert\") pod \"odh-model-controller-696fc77849-xqmjx\" 
(UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.520089 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:22.520071 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf4cae0f-528e-41b2-9e19-292633c04a9c-cert\") pod \"odh-model-controller-696fc77849-xqmjx\" (UID: \"bf4cae0f-528e-41b2-9e19-292633c04a9c\") " pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.780723 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:22.780640 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:22.915418 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:22.915379 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-xqmjx"] Apr 22 18:05:22.919916 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:05:22.919887 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4cae0f_528e_41b2_9e19_292633c04a9c.slice/crio-c346d37612fffaa5bf71e61e2d3dba155683f418cb2cc51e4c28a5d103d2c23b WatchSource:0}: Error finding container c346d37612fffaa5bf71e61e2d3dba155683f418cb2cc51e4c28a5d103d2c23b: Status 404 returned error can't find the container with id c346d37612fffaa5bf71e61e2d3dba155683f418cb2cc51e4c28a5d103d2c23b Apr 22 18:05:23.106518 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:23.106431 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-xqmjx" event={"ID":"bf4cae0f-528e-41b2-9e19-292633c04a9c","Type":"ContainerStarted","Data":"c346d37612fffaa5bf71e61e2d3dba155683f418cb2cc51e4c28a5d103d2c23b"} Apr 22 18:05:26.119201 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:26.119167 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-xqmjx" 
event={"ID":"bf4cae0f-528e-41b2-9e19-292633c04a9c","Type":"ContainerStarted","Data":"3d6e49da1fbd27437c07022dd6a7ad10a1e492f23d4c22b283a1ea2a1273e0e0"} Apr 22 18:05:26.119690 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:26.119315 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:26.137050 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:26.137007 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-xqmjx" podStartSLOduration=2.630833078 podStartE2EDuration="5.136992599s" podCreationTimestamp="2026-04-22 18:05:21 +0000 UTC" firstStartedPulling="2026-04-22 18:05:22.921125017 +0000 UTC m=+704.798394163" lastFinishedPulling="2026-04-22 18:05:25.427284538 +0000 UTC m=+707.304553684" observedRunningTime="2026-04-22 18:05:26.135135425 +0000 UTC m=+708.012404591" watchObservedRunningTime="2026-04-22 18:05:26.136992599 +0000 UTC m=+708.014261767" Apr 22 18:05:37.125238 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:37.125208 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-xqmjx" Apr 22 18:05:37.976469 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:37.976437 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-hqn8x"] Apr 22 18:05:37.978962 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:37.978947 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-hqn8x" Apr 22 18:05:37.982888 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:37.982865 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-sqzx5\"" Apr 22 18:05:37.983121 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:37.983103 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:05:37.989396 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:37.989378 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hqn8x"] Apr 22 18:05:38.062007 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:38.061970 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqms\" (UniqueName: \"kubernetes.io/projected/d6d9ddca-82ef-4030-be0f-332fbc6fcb61-kube-api-access-bpqms\") pod \"s3-init-hqn8x\" (UID: \"d6d9ddca-82ef-4030-be0f-332fbc6fcb61\") " pod="kserve/s3-init-hqn8x" Apr 22 18:05:38.162586 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:38.162551 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqms\" (UniqueName: \"kubernetes.io/projected/d6d9ddca-82ef-4030-be0f-332fbc6fcb61-kube-api-access-bpqms\") pod \"s3-init-hqn8x\" (UID: \"d6d9ddca-82ef-4030-be0f-332fbc6fcb61\") " pod="kserve/s3-init-hqn8x" Apr 22 18:05:38.171521 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:38.171493 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqms\" (UniqueName: \"kubernetes.io/projected/d6d9ddca-82ef-4030-be0f-332fbc6fcb61-kube-api-access-bpqms\") pod \"s3-init-hqn8x\" (UID: \"d6d9ddca-82ef-4030-be0f-332fbc6fcb61\") " pod="kserve/s3-init-hqn8x" Apr 22 18:05:38.287828 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:38.287728 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-hqn8x" Apr 22 18:05:38.414000 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:38.413840 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hqn8x"] Apr 22 18:05:38.416269 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:05:38.416227 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6d9ddca_82ef_4030_be0f_332fbc6fcb61.slice/crio-4513d48f3e2870b502cdce7b30a554033c488c93663425c50158d9475050c909 WatchSource:0}: Error finding container 4513d48f3e2870b502cdce7b30a554033c488c93663425c50158d9475050c909: Status 404 returned error can't find the container with id 4513d48f3e2870b502cdce7b30a554033c488c93663425c50158d9475050c909 Apr 22 18:05:39.167740 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:39.167698 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hqn8x" event={"ID":"d6d9ddca-82ef-4030-be0f-332fbc6fcb61","Type":"ContainerStarted","Data":"4513d48f3e2870b502cdce7b30a554033c488c93663425c50158d9475050c909"} Apr 22 18:05:42.862554 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:42.862530 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:05:43.186909 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:43.186870 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hqn8x" event={"ID":"d6d9ddca-82ef-4030-be0f-332fbc6fcb61","Type":"ContainerStarted","Data":"16523aab664f6efab36126b269704336ef86fcacff0a9c4ed79984b19a91741c"} Apr 22 18:05:43.202734 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:43.202684 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-hqn8x" podStartSLOduration=1.7610587789999999 podStartE2EDuration="6.202666143s" podCreationTimestamp="2026-04-22 18:05:37 +0000 UTC" firstStartedPulling="2026-04-22 18:05:38.418076733 +0000 UTC m=+720.295345880" 
lastFinishedPulling="2026-04-22 18:05:42.859684092 +0000 UTC m=+724.736953244" observedRunningTime="2026-04-22 18:05:43.2019332 +0000 UTC m=+725.079202369" watchObservedRunningTime="2026-04-22 18:05:43.202666143 +0000 UTC m=+725.079935313" Apr 22 18:05:46.197601 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:46.197568 2583 generic.go:358] "Generic (PLEG): container finished" podID="d6d9ddca-82ef-4030-be0f-332fbc6fcb61" containerID="16523aab664f6efab36126b269704336ef86fcacff0a9c4ed79984b19a91741c" exitCode=0 Apr 22 18:05:46.197988 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:46.197661 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hqn8x" event={"ID":"d6d9ddca-82ef-4030-be0f-332fbc6fcb61","Type":"ContainerDied","Data":"16523aab664f6efab36126b269704336ef86fcacff0a9c4ed79984b19a91741c"} Apr 22 18:05:47.341378 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:47.341352 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hqn8x" Apr 22 18:05:47.459172 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:47.459086 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqms\" (UniqueName: \"kubernetes.io/projected/d6d9ddca-82ef-4030-be0f-332fbc6fcb61-kube-api-access-bpqms\") pod \"d6d9ddca-82ef-4030-be0f-332fbc6fcb61\" (UID: \"d6d9ddca-82ef-4030-be0f-332fbc6fcb61\") " Apr 22 18:05:47.461254 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:47.461222 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d9ddca-82ef-4030-be0f-332fbc6fcb61-kube-api-access-bpqms" (OuterVolumeSpecName: "kube-api-access-bpqms") pod "d6d9ddca-82ef-4030-be0f-332fbc6fcb61" (UID: "d6d9ddca-82ef-4030-be0f-332fbc6fcb61"). InnerVolumeSpecName "kube-api-access-bpqms". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:05:47.560920 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:47.560886 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpqms\" (UniqueName: \"kubernetes.io/projected/d6d9ddca-82ef-4030-be0f-332fbc6fcb61-kube-api-access-bpqms\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:05:48.205291 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:48.205258 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hqn8x" Apr 22 18:05:48.205291 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:48.205266 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hqn8x" event={"ID":"d6d9ddca-82ef-4030-be0f-332fbc6fcb61","Type":"ContainerDied","Data":"4513d48f3e2870b502cdce7b30a554033c488c93663425c50158d9475050c909"} Apr 22 18:05:48.205291 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:05:48.205294 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4513d48f3e2870b502cdce7b30a554033c488c93663425c50158d9475050c909" Apr 22 18:06:06.700214 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.700186 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb"] Apr 22 18:06:06.700726 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.700710 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6d9ddca-82ef-4030-be0f-332fbc6fcb61" containerName="s3-init" Apr 22 18:06:06.700777 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.700729 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d9ddca-82ef-4030-be0f-332fbc6fcb61" containerName="s3-init" Apr 22 18:06:06.700811 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.700794 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6d9ddca-82ef-4030-be0f-332fbc6fcb61" containerName="s3-init" Apr 22 18:06:06.749697 
ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.749659 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb"] Apr 22 18:06:06.749857 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.749794 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.753066 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.753043 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:06:06.754372 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.754349 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:06:06.754483 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.754355 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 18:06:06.754483 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.754377 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:06:06.820528 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.820489 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2bgg\" (UniqueName: \"kubernetes.io/projected/b0f54523-3245-426a-8e6e-f8f6445dbc46-kube-api-access-w2bgg\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.820528 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.820533 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-dshm\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.820809 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.820644 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-home\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.820809 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.820674 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.820809 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.820710 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.820809 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.820732 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f54523-3245-426a-8e6e-f8f6445dbc46-tls-certs\") 
pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921317 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921277 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2bgg\" (UniqueName: \"kubernetes.io/projected/b0f54523-3245-426a-8e6e-f8f6445dbc46-kube-api-access-w2bgg\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921317 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921321 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-dshm\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921572 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921372 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-home\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921572 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921387 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921572 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921410 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921572 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921425 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f54523-3245-426a-8e6e-f8f6445dbc46-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921878 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921853 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-home\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921943 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.921943 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.921933 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.923657 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.923618 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-dshm\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.923940 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.923922 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f54523-3245-426a-8e6e-f8f6445dbc46-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:06.931502 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:06.931479 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2bgg\" (UniqueName: \"kubernetes.io/projected/b0f54523-3245-426a-8e6e-f8f6445dbc46-kube-api-access-w2bgg\") pod \"scheduler-configmap-ref-test-kserve-5857f646f7-b8frb\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:07.062956 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:07.062860 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:07.207070 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:07.207022 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb"] Apr 22 18:06:07.209473 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:06:07.209436 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f54523_3245_426a_8e6e_f8f6445dbc46.slice/crio-453e5e428d86176adaab7659e0bf3b1caba0fd0c8d6d3ec6be3285563f058e6a WatchSource:0}: Error finding container 453e5e428d86176adaab7659e0bf3b1caba0fd0c8d6d3ec6be3285563f058e6a: Status 404 returned error can't find the container with id 453e5e428d86176adaab7659e0bf3b1caba0fd0c8d6d3ec6be3285563f058e6a Apr 22 18:06:07.275065 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:07.275024 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" event={"ID":"b0f54523-3245-426a-8e6e-f8f6445dbc46","Type":"ContainerStarted","Data":"453e5e428d86176adaab7659e0bf3b1caba0fd0c8d6d3ec6be3285563f058e6a"} Apr 22 18:06:11.289935 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:11.289895 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" event={"ID":"b0f54523-3245-426a-8e6e-f8f6445dbc46","Type":"ContainerStarted","Data":"b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6"} Apr 22 18:06:16.308704 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:16.308666 2583 generic.go:358] "Generic (PLEG): container finished" podID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerID="b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6" exitCode=0 Apr 22 18:06:16.309083 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:16.308737 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" event={"ID":"b0f54523-3245-426a-8e6e-f8f6445dbc46","Type":"ContainerDied","Data":"b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6"} Apr 22 18:06:18.318698 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:18.318660 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" event={"ID":"b0f54523-3245-426a-8e6e-f8f6445dbc46","Type":"ContainerStarted","Data":"0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703"} Apr 22 18:06:18.338906 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:18.338856 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" podStartSLOduration=2.087014264 podStartE2EDuration="12.338840339s" podCreationTimestamp="2026-04-22 18:06:06 +0000 UTC" firstStartedPulling="2026-04-22 18:06:07.211277751 +0000 UTC m=+749.088546898" lastFinishedPulling="2026-04-22 18:06:17.463103815 +0000 UTC m=+759.340372973" observedRunningTime="2026-04-22 18:06:18.336545855 +0000 UTC m=+760.213815036" watchObservedRunningTime="2026-04-22 18:06:18.338840339 +0000 UTC m=+760.216109508" Apr 22 18:06:27.063675 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:27.063621 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:27.064138 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:27.063687 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:27.076159 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:27.076134 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 
22 18:06:27.360437 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:27.360358 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:58.628847 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.628770 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb"] Apr 22 18:06:58.629318 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.629039 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerName="main" containerID="cri-o://0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703" gracePeriod=30 Apr 22 18:06:58.912211 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.912187 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:58.988043 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988006 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f54523-3245-426a-8e6e-f8f6445dbc46-tls-certs\") pod \"b0f54523-3245-426a-8e6e-f8f6445dbc46\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " Apr 22 18:06:58.988234 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988060 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-kserve-provision-location\") pod \"b0f54523-3245-426a-8e6e-f8f6445dbc46\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " Apr 22 18:06:58.988234 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988088 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w2bgg\" (UniqueName: \"kubernetes.io/projected/b0f54523-3245-426a-8e6e-f8f6445dbc46-kube-api-access-w2bgg\") pod \"b0f54523-3245-426a-8e6e-f8f6445dbc46\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " Apr 22 18:06:58.988234 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988103 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-home\") pod \"b0f54523-3245-426a-8e6e-f8f6445dbc46\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " Apr 22 18:06:58.988412 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988239 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-model-cache\") pod \"b0f54523-3245-426a-8e6e-f8f6445dbc46\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " Apr 22 18:06:58.988412 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988281 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-dshm\") pod \"b0f54523-3245-426a-8e6e-f8f6445dbc46\" (UID: \"b0f54523-3245-426a-8e6e-f8f6445dbc46\") " Apr 22 18:06:58.988412 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988340 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-home" (OuterVolumeSpecName: "home") pod "b0f54523-3245-426a-8e6e-f8f6445dbc46" (UID: "b0f54523-3245-426a-8e6e-f8f6445dbc46"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:06:58.988567 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988508 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-model-cache" (OuterVolumeSpecName: "model-cache") pod "b0f54523-3245-426a-8e6e-f8f6445dbc46" (UID: "b0f54523-3245-426a-8e6e-f8f6445dbc46"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:06:58.988675 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988650 2583 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-home\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:06:58.988793 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.988678 2583 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-model-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:06:58.990358 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.990335 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f54523-3245-426a-8e6e-f8f6445dbc46-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b0f54523-3245-426a-8e6e-f8f6445dbc46" (UID: "b0f54523-3245-426a-8e6e-f8f6445dbc46"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:06:58.990564 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.990546 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f54523-3245-426a-8e6e-f8f6445dbc46-kube-api-access-w2bgg" (OuterVolumeSpecName: "kube-api-access-w2bgg") pod "b0f54523-3245-426a-8e6e-f8f6445dbc46" (UID: "b0f54523-3245-426a-8e6e-f8f6445dbc46"). InnerVolumeSpecName "kube-api-access-w2bgg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:06:58.990690 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:58.990560 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-dshm" (OuterVolumeSpecName: "dshm") pod "b0f54523-3245-426a-8e6e-f8f6445dbc46" (UID: "b0f54523-3245-426a-8e6e-f8f6445dbc46"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:06:59.042324 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.042273 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0f54523-3245-426a-8e6e-f8f6445dbc46" (UID: "b0f54523-3245-426a-8e6e-f8f6445dbc46"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:06:59.089606 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.089570 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:06:59.089606 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.089602 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2bgg\" (UniqueName: \"kubernetes.io/projected/b0f54523-3245-426a-8e6e-f8f6445dbc46-kube-api-access-w2bgg\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:06:59.089606 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.089612 2583 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0f54523-3245-426a-8e6e-f8f6445dbc46-dshm\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:06:59.089839 ip-10-0-131-69 kubenswrapper[2583]: I0422 
18:06:59.089634 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f54523-3245-426a-8e6e-f8f6445dbc46-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:06:59.460403 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.460367 2583 generic.go:358] "Generic (PLEG): container finished" podID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerID="0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703" exitCode=0 Apr 22 18:06:59.460585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.460403 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" event={"ID":"b0f54523-3245-426a-8e6e-f8f6445dbc46","Type":"ContainerDied","Data":"0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703"} Apr 22 18:06:59.460585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.460445 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" event={"ID":"b0f54523-3245-426a-8e6e-f8f6445dbc46","Type":"ContainerDied","Data":"453e5e428d86176adaab7659e0bf3b1caba0fd0c8d6d3ec6be3285563f058e6a"} Apr 22 18:06:59.460585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.460449 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb" Apr 22 18:06:59.460585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.460460 2583 scope.go:117] "RemoveContainer" containerID="0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703" Apr 22 18:06:59.469332 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.469315 2583 scope.go:117] "RemoveContainer" containerID="b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6" Apr 22 18:06:59.482388 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.482359 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb"] Apr 22 18:06:59.485249 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.485226 2583 scope.go:117] "RemoveContainer" containerID="0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703" Apr 22 18:06:59.485575 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:06:59.485547 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703\": container with ID starting with 0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703 not found: ID does not exist" containerID="0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703" Apr 22 18:06:59.485720 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.485585 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703"} err="failed to get container status \"0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703\": rpc error: code = NotFound desc = could not find container \"0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703\": container with ID starting with 0cc7106d8a005fd81b3b876983ea09a39b5852c531c754cacac450b363af5703 not found: ID does not 
exist" Apr 22 18:06:59.485720 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.485610 2583 scope.go:117] "RemoveContainer" containerID="b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6" Apr 22 18:06:59.485914 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:06:59.485894 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6\": container with ID starting with b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6 not found: ID does not exist" containerID="b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6" Apr 22 18:06:59.485947 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.485922 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6"} err="failed to get container status \"b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6\": rpc error: code = NotFound desc = could not find container \"b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6\": container with ID starting with b8be601db1ca13a3257126a8f554bca7673efd167839041f87d48ad40dc937e6 not found: ID does not exist" Apr 22 18:06:59.486531 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:06:59.486516 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5857f646f7-b8frb"] Apr 22 18:07:00.670903 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:00.670809 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" path="/var/lib/kubelet/pods/b0f54523-3245-426a-8e6e-f8f6445dbc46/volumes" Apr 22 18:07:17.706392 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.706328 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv"] Apr 22 
18:07:17.706936 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.706889 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerName="storage-initializer" Apr 22 18:07:17.707026 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.706939 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerName="storage-initializer" Apr 22 18:07:17.707026 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.706954 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerName="main" Apr 22 18:07:17.707026 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.706962 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerName="main" Apr 22 18:07:17.707170 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.707135 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0f54523-3245-426a-8e6e-f8f6445dbc46" containerName="main" Apr 22 18:07:17.710852 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.710833 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.713494 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.713470 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:07:17.713586 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.713528 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:07:17.714662 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.714643 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:07:17.714753 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.714668 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 18:07:17.723342 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.723317 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv"] Apr 22 18:07:17.858370 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.858329 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-home\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.858550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.858414 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rgj\" (UniqueName: \"kubernetes.io/projected/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kube-api-access-m7rgj\") pod 
\"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.858550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.858443 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.858550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.858474 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.858550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.858535 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.858722 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.858601 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-dshm\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: 
\"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.931661 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.931613 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c"] Apr 22 18:07:17.934521 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.934371 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:17.937290 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.937259 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-g295z\"" Apr 22 18:07:17.944874 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.944849 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c"] Apr 22 18:07:17.959366 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959302 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-home\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959482 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959378 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rgj\" (UniqueName: \"kubernetes.io/projected/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kube-api-access-m7rgj\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959482 
ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959409 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959482 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959435 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959482 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959453 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959715 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959521 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-dshm\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959782 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959738 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-home\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959848 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959810 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-model-cache\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.959954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.959934 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.961735 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.961709 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-dshm\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.961970 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.961954 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:17.974409 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:17.974384 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rgj\" (UniqueName: \"kubernetes.io/projected/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kube-api-access-m7rgj\") pod \"scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:18.021145 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.021108 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:18.060411 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.060372 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtk52\" (UniqueName: \"kubernetes.io/projected/767123c8-e6d2-4986-9b3f-6331624aea7d-kube-api-access-rtk52\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.060578 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.060440 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.060578 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.060530 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.060677 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.060585 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/767123c8-e6d2-4986-9b3f-6331624aea7d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.060677 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.060657 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.060761 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.060690 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.143402 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.143376 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv"] Apr 22 18:07:18.145444 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:07:18.145408 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab79122_94e7_436e_b24a_b2fb4c15f1b8.slice/crio-c118bc367d2aac8c7b67224e12e3e8291f5547065f2550f6ba5d4c39290f3ed0 WatchSource:0}: Error finding container c118bc367d2aac8c7b67224e12e3e8291f5547065f2550f6ba5d4c39290f3ed0: Status 404 returned error can't find the container with id c118bc367d2aac8c7b67224e12e3e8291f5547065f2550f6ba5d4c39290f3ed0 Apr 22 18:07:18.161290 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161265 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtk52\" (UniqueName: \"kubernetes.io/projected/767123c8-e6d2-4986-9b3f-6331624aea7d-kube-api-access-rtk52\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161377 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161324 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161433 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161374 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: 
\"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161433 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161398 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/767123c8-e6d2-4986-9b3f-6331624aea7d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161433 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161423 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161574 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161457 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161791 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161763 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161906 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161861 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.161906 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.161870 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.162105 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.162086 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.164029 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.164005 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/767123c8-e6d2-4986-9b3f-6331624aea7d-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.169008 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.168987 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtk52\" (UniqueName: \"kubernetes.io/projected/767123c8-e6d2-4986-9b3f-6331624aea7d-kube-api-access-rtk52\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.245829 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.245792 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:18.374128 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.374103 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c"] Apr 22 18:07:18.376067 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:07:18.376039 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767123c8_e6d2_4986_9b3f_6331624aea7d.slice/crio-ca6aafa9664b88692d578ec9b8557534934a1da09936b8dcfbf47695233f3d0f WatchSource:0}: Error finding container ca6aafa9664b88692d578ec9b8557534934a1da09936b8dcfbf47695233f3d0f: Status 404 returned error can't find the container with id ca6aafa9664b88692d578ec9b8557534934a1da09936b8dcfbf47695233f3d0f Apr 22 18:07:18.532393 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.532282 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerStarted","Data":"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0"} Apr 22 
18:07:18.532393 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.532334 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerStarted","Data":"ca6aafa9664b88692d578ec9b8557534934a1da09936b8dcfbf47695233f3d0f"} Apr 22 18:07:18.533740 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.533715 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" event={"ID":"2ab79122-94e7-436e-b24a-b2fb4c15f1b8","Type":"ContainerStarted","Data":"fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d"} Apr 22 18:07:18.533848 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:18.533744 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" event={"ID":"2ab79122-94e7-436e-b24a-b2fb4c15f1b8","Type":"ContainerStarted","Data":"c118bc367d2aac8c7b67224e12e3e8291f5547065f2550f6ba5d4c39290f3ed0"} Apr 22 18:07:19.539913 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:19.539871 2583 generic.go:358] "Generic (PLEG): container finished" podID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerID="6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0" exitCode=0 Apr 22 18:07:19.540444 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:19.539935 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerDied","Data":"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0"} Apr 22 18:07:21.549793 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:21.549615 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" 
event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerStarted","Data":"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c"} Apr 22 18:07:23.566319 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:23.566279 2583 generic.go:358] "Generic (PLEG): container finished" podID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerID="fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d" exitCode=0 Apr 22 18:07:23.566882 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:23.566350 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" event={"ID":"2ab79122-94e7-436e-b24a-b2fb4c15f1b8","Type":"ContainerDied","Data":"fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d"} Apr 22 18:07:24.572614 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:24.572574 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" event={"ID":"2ab79122-94e7-436e-b24a-b2fb4c15f1b8","Type":"ContainerStarted","Data":"f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5"} Apr 22 18:07:24.592139 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:24.592090 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" podStartSLOduration=7.592075142 podStartE2EDuration="7.592075142s" podCreationTimestamp="2026-04-22 18:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:07:24.591041706 +0000 UTC m=+826.468310911" watchObservedRunningTime="2026-04-22 18:07:24.592075142 +0000 UTC m=+826.469344309" Apr 22 18:07:28.021681 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:28.021642 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:28.021681 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:28.021691 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:28.040306 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:28.040279 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:28.604112 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:28.604082 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:07:50.673969 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:50.673934 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerStarted","Data":"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345"} Apr 22 18:07:50.674425 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:50.674196 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:50.676604 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:50.676564 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:07:50.698146 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:50.698088 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podStartSLOduration=3.162056651 podStartE2EDuration="33.698071504s" podCreationTimestamp="2026-04-22 18:07:17 +0000 UTC" firstStartedPulling="2026-04-22 18:07:19.541311674 +0000 UTC m=+821.418580821" lastFinishedPulling="2026-04-22 18:07:50.077326528 +0000 UTC m=+851.954595674" observedRunningTime="2026-04-22 18:07:50.694312886 +0000 UTC m=+852.571582076" watchObservedRunningTime="2026-04-22 18:07:50.698071504 +0000 UTC m=+852.575340673" Apr 22 18:07:51.679072 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:51.679035 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:07:58.246148 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:58.246110 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:58.246686 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:58.246163 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:58.247715 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:58.247667 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:07:58.247890 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:58.247873 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:58.703161 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:58.703128 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:07:58.703328 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:58.703259 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:07:59.705995 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:07:59.705960 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:08:09.706742 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:09.706698 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:08:12.276538 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:12.276503 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv"] Apr 22 18:08:12.276957 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:12.276793 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerName="main" 
containerID="cri-o://f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5" gracePeriod=30 Apr 22 18:08:12.283230 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:12.283202 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c"] Apr 22 18:08:12.283620 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:12.283590 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" containerID="cri-o://cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c" gracePeriod=30 Apr 22 18:08:12.283865 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:12.283609 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="tokenizer" containerID="cri-o://977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345" gracePeriod=30 Apr 22 18:08:12.285401 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:12.285338 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 18:08:13.673408 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.673362 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:08:13.676920 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.676899 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:08:13.700059 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700033 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-home\") pod \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " Apr 22 18:08:13.700221 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700079 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-model-cache\") pod \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " Apr 22 18:08:13.700221 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700123 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kserve-provision-location\") pod \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " Apr 22 18:08:13.700221 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700150 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtk52\" (UniqueName: \"kubernetes.io/projected/767123c8-e6d2-4986-9b3f-6331624aea7d-kube-api-access-rtk52\") pod \"767123c8-e6d2-4986-9b3f-6331624aea7d\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " Apr 22 18:08:13.700221 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700174 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-kserve-provision-location\") pod \"767123c8-e6d2-4986-9b3f-6331624aea7d\" (UID: 
\"767123c8-e6d2-4986-9b3f-6331624aea7d\") " Apr 22 18:08:13.700221 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700197 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/767123c8-e6d2-4986-9b3f-6331624aea7d-tls-certs\") pod \"767123c8-e6d2-4986-9b3f-6331624aea7d\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " Apr 22 18:08:13.700477 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700334 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-home" (OuterVolumeSpecName: "home") pod "2ab79122-94e7-436e-b24a-b2fb4c15f1b8" (UID: "2ab79122-94e7-436e-b24a-b2fb4c15f1b8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.700477 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700443 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-model-cache" (OuterVolumeSpecName: "model-cache") pod "2ab79122-94e7-436e-b24a-b2fb4c15f1b8" (UID: "2ab79122-94e7-436e-b24a-b2fb4c15f1b8"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.700477 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700473 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-cache\") pod \"767123c8-e6d2-4986-9b3f-6331624aea7d\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " Apr 22 18:08:13.700648 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700511 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-dshm\") pod \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " Apr 22 18:08:13.700648 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700598 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7rgj\" (UniqueName: \"kubernetes.io/projected/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kube-api-access-m7rgj\") pod \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " Apr 22 18:08:13.700754 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700653 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-uds\") pod \"767123c8-e6d2-4986-9b3f-6331624aea7d\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " Apr 22 18:08:13.700754 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700690 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-tls-certs\") pod \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\" (UID: \"2ab79122-94e7-436e-b24a-b2fb4c15f1b8\") " Apr 22 18:08:13.700754 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700720 2583 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-tmp\") pod \"767123c8-e6d2-4986-9b3f-6331624aea7d\" (UID: \"767123c8-e6d2-4986-9b3f-6331624aea7d\") " Apr 22 18:08:13.700893 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.700790 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "767123c8-e6d2-4986-9b3f-6331624aea7d" (UID: "767123c8-e6d2-4986-9b3f-6331624aea7d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.701065 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.701046 2583 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-home\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.701151 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.701071 2583 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-model-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.701151 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.701088 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.701151 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.701090 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "767123c8-e6d2-4986-9b3f-6331624aea7d" (UID: 
"767123c8-e6d2-4986-9b3f-6331624aea7d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.701583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.701298 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "767123c8-e6d2-4986-9b3f-6331624aea7d" (UID: "767123c8-e6d2-4986-9b3f-6331624aea7d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.701583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.701446 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "767123c8-e6d2-4986-9b3f-6331624aea7d" (UID: "767123c8-e6d2-4986-9b3f-6331624aea7d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.703284 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.703248 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767123c8-e6d2-4986-9b3f-6331624aea7d-kube-api-access-rtk52" (OuterVolumeSpecName: "kube-api-access-rtk52") pod "767123c8-e6d2-4986-9b3f-6331624aea7d" (UID: "767123c8-e6d2-4986-9b3f-6331624aea7d"). InnerVolumeSpecName "kube-api-access-rtk52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:13.703447 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.703402 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767123c8-e6d2-4986-9b3f-6331624aea7d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "767123c8-e6d2-4986-9b3f-6331624aea7d" (UID: "767123c8-e6d2-4986-9b3f-6331624aea7d"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:08:13.703447 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.703402 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2ab79122-94e7-436e-b24a-b2fb4c15f1b8" (UID: "2ab79122-94e7-436e-b24a-b2fb4c15f1b8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:08:13.703571 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.703478 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-dshm" (OuterVolumeSpecName: "dshm") pod "2ab79122-94e7-436e-b24a-b2fb4c15f1b8" (UID: "2ab79122-94e7-436e-b24a-b2fb4c15f1b8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.703856 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.703830 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kube-api-access-m7rgj" (OuterVolumeSpecName: "kube-api-access-m7rgj") pod "2ab79122-94e7-436e-b24a-b2fb4c15f1b8" (UID: "2ab79122-94e7-436e-b24a-b2fb4c15f1b8"). InnerVolumeSpecName "kube-api-access-m7rgj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:13.757758 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.757722 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ab79122-94e7-436e-b24a-b2fb4c15f1b8" (UID: "2ab79122-94e7-436e-b24a-b2fb4c15f1b8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:13.757992 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.757971 2583 generic.go:358] "Generic (PLEG): container finished" podID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerID="f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5" exitCode=0 Apr 22 18:08:13.758066 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.758050 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" Apr 22 18:08:13.758066 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.758054 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" event={"ID":"2ab79122-94e7-436e-b24a-b2fb4c15f1b8","Type":"ContainerDied","Data":"f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5"} Apr 22 18:08:13.758156 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.758088 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv" event={"ID":"2ab79122-94e7-436e-b24a-b2fb4c15f1b8","Type":"ContainerDied","Data":"c118bc367d2aac8c7b67224e12e3e8291f5547065f2550f6ba5d4c39290f3ed0"} Apr 22 18:08:13.758156 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.758103 2583 scope.go:117] "RemoveContainer" containerID="f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5" Apr 22 18:08:13.760008 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.759956 2583 generic.go:358] "Generic (PLEG): container finished" podID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerID="977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345" exitCode=0 Apr 22 18:08:13.760008 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.759982 2583 generic.go:358] "Generic (PLEG): container finished" podID="767123c8-e6d2-4986-9b3f-6331624aea7d" 
containerID="cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c" exitCode=0 Apr 22 18:08:13.760213 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.760016 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerDied","Data":"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345"} Apr 22 18:08:13.760213 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.760033 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" Apr 22 18:08:13.760213 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.760041 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerDied","Data":"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c"} Apr 22 18:08:13.760213 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.760055 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c" event={"ID":"767123c8-e6d2-4986-9b3f-6331624aea7d","Type":"ContainerDied","Data":"ca6aafa9664b88692d578ec9b8557534934a1da09936b8dcfbf47695233f3d0f"} Apr 22 18:08:13.768094 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.768078 2583 scope.go:117] "RemoveContainer" containerID="fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d" Apr 22 18:08:13.777719 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.777698 2583 scope.go:117] "RemoveContainer" containerID="f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5" Apr 22 18:08:13.777974 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:13.777948 2583 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5\": container with ID starting with f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5 not found: ID does not exist" containerID="f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5" Apr 22 18:08:13.778069 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.777980 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5"} err="failed to get container status \"f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5\": rpc error: code = NotFound desc = could not find container \"f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5\": container with ID starting with f5b02421d2413a18a5174b620e24395792e2eb90b9d2f9691fed5937900170c5 not found: ID does not exist" Apr 22 18:08:13.778069 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.777998 2583 scope.go:117] "RemoveContainer" containerID="fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d" Apr 22 18:08:13.778186 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:13.778170 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d\": container with ID starting with fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d not found: ID does not exist" containerID="fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d" Apr 22 18:08:13.778236 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.778191 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d"} err="failed to get container status \"fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d\": rpc 
error: code = NotFound desc = could not find container \"fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d\": container with ID starting with fec907750961d755094d637559e5ba93bd27af6d7f39d81aedde29188462189d not found: ID does not exist" Apr 22 18:08:13.778236 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.778205 2583 scope.go:117] "RemoveContainer" containerID="977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345" Apr 22 18:08:13.781458 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.781436 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv"] Apr 22 18:08:13.786285 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.786268 2583 scope.go:117] "RemoveContainer" containerID="cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c" Apr 22 18:08:13.788610 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.788589 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-694b86bf89-cd8kv"] Apr 22 18:08:13.793449 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.793429 2583 scope.go:117] "RemoveContainer" containerID="6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0" Apr 22 18:08:13.800492 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.800467 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c"] Apr 22 18:08:13.801757 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801724 2583 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-dshm\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801757 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801754 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7rgj\" (UniqueName: 
\"kubernetes.io/projected/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kube-api-access-m7rgj\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801770 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-uds\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801785 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801793 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-tokenizer-tmp\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801802 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab79122-94e7-436e-b24a-b2fb4c15f1b8-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801730 2583 scope.go:117] "RemoveContainer" containerID="977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801816 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtk52\" (UniqueName: \"kubernetes.io/projected/767123c8-e6d2-4986-9b3f-6331624aea7d-kube-api-access-rtk52\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801829 2583 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767123c8-e6d2-4986-9b3f-6331624aea7d-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.801918 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.801883 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/767123c8-e6d2-4986-9b3f-6331624aea7d-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:13.802295 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:13.802152 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345\": container with ID starting with 977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345 not found: ID does not exist" containerID="977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345" Apr 22 18:08:13.802295 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.802178 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345"} err="failed to get container status \"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345\": rpc error: code = NotFound desc = could not find container \"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345\": container with ID starting with 977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345 not found: ID does not exist" Apr 22 18:08:13.802295 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.802199 2583 scope.go:117] "RemoveContainer" containerID="cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c" Apr 22 18:08:13.802464 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:13.802441 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c\": container with ID starting with cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c not found: ID does not exist" containerID="cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c" Apr 22 18:08:13.802517 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.802473 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c"} err="failed to get container status \"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c\": rpc error: code = NotFound desc = could not find container \"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c\": container with ID starting with cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c not found: ID does not exist" Apr 22 18:08:13.802517 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.802493 2583 scope.go:117] "RemoveContainer" containerID="6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0" Apr 22 18:08:13.802764 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:13.802746 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0\": container with ID starting with 6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0 not found: ID does not exist" containerID="6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0" Apr 22 18:08:13.802855 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.802768 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0"} err="failed to get container status \"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0\": rpc error: code = NotFound desc = 
could not find container \"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0\": container with ID starting with 6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0 not found: ID does not exist" Apr 22 18:08:13.802855 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.802785 2583 scope.go:117] "RemoveContainer" containerID="977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345" Apr 22 18:08:13.803086 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.803052 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345"} err="failed to get container status \"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345\": rpc error: code = NotFound desc = could not find container \"977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345\": container with ID starting with 977912b7e9f85a1f75648b2b41024c4efb94ef1497fc233a77bbdb7dad998345 not found: ID does not exist" Apr 22 18:08:13.803086 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.803083 2583 scope.go:117] "RemoveContainer" containerID="cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c" Apr 22 18:08:13.803356 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.803332 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c"} err="failed to get container status \"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c\": rpc error: code = NotFound desc = could not find container \"cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c\": container with ID starting with cc4ceb3678f8f791327df7b3b02827883e58e1d838a727db24d9bbcac036cc3c not found: ID does not exist" Apr 22 18:08:13.803423 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.803357 2583 scope.go:117] "RemoveContainer" 
containerID="6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0" Apr 22 18:08:13.803679 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.803616 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0"} err="failed to get container status \"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0\": rpc error: code = NotFound desc = could not find container \"6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0\": container with ID starting with 6cd640a7fcc7bed758347e38fc687ad03f5b8f629d1c8834f6f636c26751dfb0 not found: ID does not exist" Apr 22 18:08:13.805949 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:13.805929 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6b46b76np98c"] Apr 22 18:08:14.665657 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:14.665610 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" path="/var/lib/kubelet/pods/2ab79122-94e7-436e-b24a-b2fb4c15f1b8/volumes" Apr 22 18:08:14.666052 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:14.666039 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" path="/var/lib/kubelet/pods/767123c8-e6d2-4986-9b3f-6331624aea7d/volumes" Apr 22 18:08:17.097409 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097368 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf"] Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097746 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="tokenizer" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097758 2583 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="tokenizer" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097775 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097781 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097789 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="storage-initializer" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097795 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="storage-initializer" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097802 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerName="main" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097807 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerName="main" Apr 22 18:08:17.097820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097823 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerName="storage-initializer" Apr 22 18:08:17.098165 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097829 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerName="storage-initializer" Apr 22 18:08:17.098165 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097884 2583 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="2ab79122-94e7-436e-b24a-b2fb4c15f1b8" containerName="main" Apr 22 18:08:17.098165 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097894 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="main" Apr 22 18:08:17.098165 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.097900 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="767123c8-e6d2-4986-9b3f-6331624aea7d" containerName="tokenizer" Apr 22 18:08:17.104201 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.104173 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.107854 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.107830 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:08:17.109329 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.109307 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:08:17.109449 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.109378 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:08:17.109449 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.109379 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 18:08:17.112362 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.112341 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf"] Apr 22 18:08:17.131827 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.131802 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-model-cache\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.132036 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.132020 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-home\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.132168 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.132151 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96fff750-644f-4c8d-b2ae-4b92d0983162-tls-certs\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.132278 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.132256 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.132403 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.132287 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-dshm\") pod 
\"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.132403 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.132309 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjr2j\" (UniqueName: \"kubernetes.io/projected/96fff750-644f-4c8d-b2ae-4b92d0983162-kube-api-access-sjr2j\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233207 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233178 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-model-cache\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233363 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233226 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-home\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233363 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233249 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96fff750-644f-4c8d-b2ae-4b92d0983162-tls-certs\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233363 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233269 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233363 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233285 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-dshm\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233363 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233305 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjr2j\" (UniqueName: \"kubernetes.io/projected/96fff750-644f-4c8d-b2ae-4b92d0983162-kube-api-access-sjr2j\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233705 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233583 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-model-cache\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233705 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233682 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.233798 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.233720 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-home\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.235764 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.235741 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-dshm\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.235881 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.235862 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96fff750-644f-4c8d-b2ae-4b92d0983162-tls-certs\") pod \"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.242525 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.242501 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjr2j\" (UniqueName: \"kubernetes.io/projected/96fff750-644f-4c8d-b2ae-4b92d0983162-kube-api-access-sjr2j\") pod 
\"precise-prefix-cache-test-kserve-549d7668fc-gqncf\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.443697 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.443655 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:17.568068 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.568038 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf"] Apr 22 18:08:17.570713 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:08:17.570688 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96fff750_644f_4c8d_b2ae_4b92d0983162.slice/crio-ae6c78d32e4ce12bbeed20c6f8aeb612695a3b372746f3314e81397ffa36fc44 WatchSource:0}: Error finding container ae6c78d32e4ce12bbeed20c6f8aeb612695a3b372746f3314e81397ffa36fc44: Status 404 returned error can't find the container with id ae6c78d32e4ce12bbeed20c6f8aeb612695a3b372746f3314e81397ffa36fc44 Apr 22 18:08:17.776107 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.776017 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" event={"ID":"96fff750-644f-4c8d-b2ae-4b92d0983162","Type":"ContainerStarted","Data":"db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85"} Apr 22 18:08:17.776107 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:17.776066 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" event={"ID":"96fff750-644f-4c8d-b2ae-4b92d0983162","Type":"ContainerStarted","Data":"ae6c78d32e4ce12bbeed20c6f8aeb612695a3b372746f3314e81397ffa36fc44"} Apr 22 18:08:21.792251 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:21.792218 2583 
generic.go:358] "Generic (PLEG): container finished" podID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerID="db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85" exitCode=0 Apr 22 18:08:21.792576 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:21.792291 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" event={"ID":"96fff750-644f-4c8d-b2ae-4b92d0983162","Type":"ContainerDied","Data":"db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85"} Apr 22 18:08:22.796974 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:22.796936 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" event={"ID":"96fff750-644f-4c8d-b2ae-4b92d0983162","Type":"ContainerStarted","Data":"107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f"} Apr 22 18:08:22.819832 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:22.819781 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" podStartSLOduration=5.819764747 podStartE2EDuration="5.819764747s" podCreationTimestamp="2026-04-22 18:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:22.817164448 +0000 UTC m=+884.694433653" watchObservedRunningTime="2026-04-22 18:08:22.819764747 +0000 UTC m=+884.697033915" Apr 22 18:08:27.444161 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:27.444114 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:27.444161 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:27.444170 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 
18:08:27.457673 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:27.457646 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:27.824730 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:27.824655 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:38.611546 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:38.611518 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:08:38.612070 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:38.612051 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:08:50.966558 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:50.966521 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf"] Apr 22 18:08:50.966997 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:50.966913 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerName="main" containerID="cri-o://107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f" gracePeriod=30 Apr 22 18:08:51.211053 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.211031 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:51.346399 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346318 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96fff750-644f-4c8d-b2ae-4b92d0983162-tls-certs\") pod \"96fff750-644f-4c8d-b2ae-4b92d0983162\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " Apr 22 18:08:51.346399 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346363 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-home\") pod \"96fff750-644f-4c8d-b2ae-4b92d0983162\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " Apr 22 18:08:51.346619 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346467 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-kserve-provision-location\") pod \"96fff750-644f-4c8d-b2ae-4b92d0983162\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " Apr 22 18:08:51.346619 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346514 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-model-cache\") pod \"96fff750-644f-4c8d-b2ae-4b92d0983162\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " Apr 22 18:08:51.346619 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346566 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-dshm\") pod \"96fff750-644f-4c8d-b2ae-4b92d0983162\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " Apr 22 18:08:51.346619 ip-10-0-131-69 kubenswrapper[2583]: I0422 
18:08:51.346610 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjr2j\" (UniqueName: \"kubernetes.io/projected/96fff750-644f-4c8d-b2ae-4b92d0983162-kube-api-access-sjr2j\") pod \"96fff750-644f-4c8d-b2ae-4b92d0983162\" (UID: \"96fff750-644f-4c8d-b2ae-4b92d0983162\") " Apr 22 18:08:51.346619 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346613 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-home" (OuterVolumeSpecName: "home") pod "96fff750-644f-4c8d-b2ae-4b92d0983162" (UID: "96fff750-644f-4c8d-b2ae-4b92d0983162"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.346905 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346827 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-model-cache" (OuterVolumeSpecName: "model-cache") pod "96fff750-644f-4c8d-b2ae-4b92d0983162" (UID: "96fff750-644f-4c8d-b2ae-4b92d0983162"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.346971 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346942 2583 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-model-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.346971 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.346962 2583 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-home\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.348758 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.348707 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fff750-644f-4c8d-b2ae-4b92d0983162-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "96fff750-644f-4c8d-b2ae-4b92d0983162" (UID: "96fff750-644f-4c8d-b2ae-4b92d0983162"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:08:51.348900 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.348788 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fff750-644f-4c8d-b2ae-4b92d0983162-kube-api-access-sjr2j" (OuterVolumeSpecName: "kube-api-access-sjr2j") pod "96fff750-644f-4c8d-b2ae-4b92d0983162" (UID: "96fff750-644f-4c8d-b2ae-4b92d0983162"). InnerVolumeSpecName "kube-api-access-sjr2j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:51.349279 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.349250 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-dshm" (OuterVolumeSpecName: "dshm") pod "96fff750-644f-4c8d-b2ae-4b92d0983162" (UID: "96fff750-644f-4c8d-b2ae-4b92d0983162"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.401742 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.401680 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96fff750-644f-4c8d-b2ae-4b92d0983162" (UID: "96fff750-644f-4c8d-b2ae-4b92d0983162"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.447979 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.447947 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjr2j\" (UniqueName: \"kubernetes.io/projected/96fff750-644f-4c8d-b2ae-4b92d0983162-kube-api-access-sjr2j\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.447979 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.447973 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/96fff750-644f-4c8d-b2ae-4b92d0983162-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.447979 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.447985 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.448203 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.447997 2583 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/96fff750-644f-4c8d-b2ae-4b92d0983162-dshm\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.890954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.890921 2583 generic.go:358] "Generic (PLEG): container finished" podID="96fff750-644f-4c8d-b2ae-4b92d0983162" 
containerID="107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f" exitCode=0 Apr 22 18:08:51.891129 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.890971 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" event={"ID":"96fff750-644f-4c8d-b2ae-4b92d0983162","Type":"ContainerDied","Data":"107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f"} Apr 22 18:08:51.891129 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.890999 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" event={"ID":"96fff750-644f-4c8d-b2ae-4b92d0983162","Type":"ContainerDied","Data":"ae6c78d32e4ce12bbeed20c6f8aeb612695a3b372746f3314e81397ffa36fc44"} Apr 22 18:08:51.891129 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.890998 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf" Apr 22 18:08:51.891129 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.891062 2583 scope.go:117] "RemoveContainer" containerID="107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f" Apr 22 18:08:51.908387 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.908368 2583 scope.go:117] "RemoveContainer" containerID="db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85" Apr 22 18:08:51.919784 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.919759 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf"] Apr 22 18:08:51.923549 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.923528 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-gqncf"] Apr 22 18:08:51.925039 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.925023 2583 scope.go:117] "RemoveContainer" 
containerID="107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f" Apr 22 18:08:51.925297 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:51.925277 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f\": container with ID starting with 107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f not found: ID does not exist" containerID="107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f" Apr 22 18:08:51.925362 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.925304 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f"} err="failed to get container status \"107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f\": rpc error: code = NotFound desc = could not find container \"107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f\": container with ID starting with 107ac9c245a1bf4dcd16568ede7fcfa08d1cbac74f4a71cb54a1c5535d579f5f not found: ID does not exist" Apr 22 18:08:51.925362 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.925326 2583 scope.go:117] "RemoveContainer" containerID="db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85" Apr 22 18:08:51.925594 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:08:51.925575 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85\": container with ID starting with db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85 not found: ID does not exist" containerID="db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85" Apr 22 18:08:51.925667 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:51.925601 2583 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85"} err="failed to get container status \"db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85\": rpc error: code = NotFound desc = could not find container \"db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85\": container with ID starting with db7a5eef1e2c1a0ae9d39b4a68d27f09b1e245ba05f68ddfbab9e3fa3d02ff85 not found: ID does not exist" Apr 22 18:08:52.665793 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:08:52.665760 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" path="/var/lib/kubelet/pods/96fff750-644f-4c8d-b2ae-4b92d0983162/volumes" Apr 22 18:09:02.782493 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.782459 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"] Apr 22 18:09:02.783060 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.783030 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerName="storage-initializer" Apr 22 18:09:02.783060 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.783046 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerName="storage-initializer" Apr 22 18:09:02.783060 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.783060 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerName="main" Apr 22 18:09:02.783189 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.783066 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerName="main" Apr 22 18:09:02.783189 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.783130 2583 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="96fff750-644f-4c8d-b2ae-4b92d0983162" containerName="main" Apr 22 18:09:02.785359 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.785329 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.787822 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.787790 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:09:02.788053 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.788034 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:09:02.788200 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.788064 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-58qfs\"" Apr 22 18:09:02.789124 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.789105 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 18:09:02.789221 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.789109 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:09:02.794467 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.794438 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"] Apr 22 18:09:02.854019 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.853982 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: 
\"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.854203 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.854042 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.854203 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.854114 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1c5db9-f152-445c-a595-734ebefa34b7-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.854203 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.854160 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.854203 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.854182 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: 
\"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.854354 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.854221 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9gmp\" (UniqueName: \"kubernetes.io/projected/8d1c5db9-f152-445c-a595-734ebefa34b7-kube-api-access-z9gmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.955761 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.955717 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.956004 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.955804 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1c5db9-f152-445c-a595-734ebefa34b7-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" Apr 22 18:09:02.956004 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.955838 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956004 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.955871 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956004 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.955923 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9gmp\" (UniqueName: \"kubernetes.io/projected/8d1c5db9-f152-445c-a595-734ebefa34b7-kube-api-access-z9gmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956004 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.955956 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956283 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.956184 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956283 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.956249 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956361 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.956284 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.956361 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.956309 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.958301 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.958280 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1c5db9-f152-445c-a595-734ebefa34b7-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:02.963910 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:02.963884 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9gmp\" (UniqueName: \"kubernetes.io/projected/8d1c5db9-f152-445c-a595-734ebefa34b7-kube-api-access-z9gmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-dg789\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:03.096691 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:03.096573 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:03.249497 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:03.249470 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"]
Apr 22 18:09:03.251959 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:09:03.251933 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1c5db9_f152_445c_a595_734ebefa34b7.slice/crio-dd02fd25bd5df8bab03257486effd42ad365f32e405ee6ef41a376962648c627 WatchSource:0}: Error finding container dd02fd25bd5df8bab03257486effd42ad365f32e405ee6ef41a376962648c627: Status 404 returned error can't find the container with id dd02fd25bd5df8bab03257486effd42ad365f32e405ee6ef41a376962648c627
Apr 22 18:09:03.934158 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:03.934116 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerStarted","Data":"fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a"}
Apr 22 18:09:03.934158 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:03.934161 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerStarted","Data":"dd02fd25bd5df8bab03257486effd42ad365f32e405ee6ef41a376962648c627"}
Apr 22 18:09:04.939283 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:04.939241 2583 generic.go:358] "Generic (PLEG): container finished" podID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerID="fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a" exitCode=0
Apr 22 18:09:04.939805 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:04.939336 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerDied","Data":"fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a"}
Apr 22 18:09:05.945124 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:05.945081 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerStarted","Data":"1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b"}
Apr 22 18:09:05.945124 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:05.945127 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerStarted","Data":"cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8"}
Apr 22 18:09:05.945563 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:05.945227 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:05.967874 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:05.967823 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" podStartSLOduration=3.967806638 podStartE2EDuration="3.967806638s" podCreationTimestamp="2026-04-22 18:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:09:05.96540352 +0000 UTC m=+927.842672697" watchObservedRunningTime="2026-04-22 18:09:05.967806638 +0000 UTC m=+927.845075805"
Apr 22 18:09:13.097115 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:13.097082 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:13.097613 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:13.097128 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:13.099796 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:13.099769 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:13.974811 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:13.974778 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:09:34.979613 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:09:34.979580 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:11:14.982805 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:14.982764 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"]
Apr 22 18:11:14.983303 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:14.983149 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="main" containerID="cri-o://cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8" gracePeriod=30
Apr 22 18:11:14.983503 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:14.983443 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="tokenizer" containerID="cri-o://1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b" gracePeriod=30
Apr 22 18:11:15.398550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:15.398464 2583 generic.go:358] "Generic (PLEG): container finished" podID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerID="cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8" exitCode=0
Apr 22 18:11:15.398550 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:15.398535 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerDied","Data":"cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8"}
Apr 22 18:11:16.230087 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.230060 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:11:16.351955 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.351872 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-tmp\") pod \"8d1c5db9-f152-445c-a595-734ebefa34b7\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") "
Apr 22 18:11:16.351955 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.351912 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1c5db9-f152-445c-a595-734ebefa34b7-tls-certs\") pod \"8d1c5db9-f152-445c-a595-734ebefa34b7\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") "
Apr 22 18:11:16.352162 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.351962 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-kserve-provision-location\") pod \"8d1c5db9-f152-445c-a595-734ebefa34b7\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") "
Apr 22 18:11:16.352162 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.351999 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9gmp\" (UniqueName: \"kubernetes.io/projected/8d1c5db9-f152-445c-a595-734ebefa34b7-kube-api-access-z9gmp\") pod \"8d1c5db9-f152-445c-a595-734ebefa34b7\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") "
Apr 22 18:11:16.352162 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352026 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-uds\") pod \"8d1c5db9-f152-445c-a595-734ebefa34b7\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") "
Apr 22 18:11:16.352162 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352111 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-cache\") pod \"8d1c5db9-f152-445c-a595-734ebefa34b7\" (UID: \"8d1c5db9-f152-445c-a595-734ebefa34b7\") "
Apr 22 18:11:16.352402 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352273 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8d1c5db9-f152-445c-a595-734ebefa34b7" (UID: "8d1c5db9-f152-445c-a595-734ebefa34b7"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:16.352402 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352257 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8d1c5db9-f152-445c-a595-734ebefa34b7" (UID: "8d1c5db9-f152-445c-a595-734ebefa34b7"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:16.352402 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352367 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8d1c5db9-f152-445c-a595-734ebefa34b7" (UID: "8d1c5db9-f152-445c-a595-734ebefa34b7"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:16.352553 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352472 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-tmp\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 18:11:16.352553 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352493 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-uds\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 18:11:16.352553 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352507 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-tokenizer-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 18:11:16.352791 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.352769 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d1c5db9-f152-445c-a595-734ebefa34b7" (UID: "8d1c5db9-f152-445c-a595-734ebefa34b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:16.354192 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.354162 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1c5db9-f152-445c-a595-734ebefa34b7-kube-api-access-z9gmp" (OuterVolumeSpecName: "kube-api-access-z9gmp") pod "8d1c5db9-f152-445c-a595-734ebefa34b7" (UID: "8d1c5db9-f152-445c-a595-734ebefa34b7"). InnerVolumeSpecName "kube-api-access-z9gmp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:11:16.354294 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.354217 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1c5db9-f152-445c-a595-734ebefa34b7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8d1c5db9-f152-445c-a595-734ebefa34b7" (UID: "8d1c5db9-f152-445c-a595-734ebefa34b7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:11:16.402976 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.402945 2583 generic.go:358] "Generic (PLEG): container finished" podID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerID="1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b" exitCode=0
Apr 22 18:11:16.403149 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.403027 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"
Apr 22 18:11:16.403149 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.403026 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerDied","Data":"1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b"}
Apr 22 18:11:16.403149 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.403131 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789" event={"ID":"8d1c5db9-f152-445c-a595-734ebefa34b7","Type":"ContainerDied","Data":"dd02fd25bd5df8bab03257486effd42ad365f32e405ee6ef41a376962648c627"}
Apr 22 18:11:16.403149 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.403148 2583 scope.go:117] "RemoveContainer" containerID="1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b"
Apr 22 18:11:16.412312 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.412294 2583 scope.go:117] "RemoveContainer" containerID="cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8"
Apr 22 18:11:16.419399 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.419382 2583 scope.go:117] "RemoveContainer" containerID="fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a"
Apr 22 18:11:16.425779 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.425757 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"]
Apr 22 18:11:16.427150 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.427133 2583 scope.go:117] "RemoveContainer" containerID="1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b"
Apr 22 18:11:16.427390 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:11:16.427371 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b\": container with ID starting with 1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b not found: ID does not exist" containerID="1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b"
Apr 22 18:11:16.427455 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.427399 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b"} err="failed to get container status \"1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b\": rpc error: code = NotFound desc = could not find container \"1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b\": container with ID starting with 1d76869f608ad43d9c87b7b54d23b486acfeb80f6cf768888bf7c771698f882b not found: ID does not exist"
Apr 22 18:11:16.427455 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.427419 2583 scope.go:117] "RemoveContainer" containerID="cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8"
Apr 22 18:11:16.427772 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:11:16.427747 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8\": container with ID starting with cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8 not found: ID does not exist" containerID="cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8"
Apr 22 18:11:16.427909 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.427787 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8"} err="failed to get container status \"cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8\": rpc error: code = NotFound desc = could not find container \"cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8\": container with ID starting with cc7c38d3aaa4c15444746997edfab128d681bde02766c00183cc0ade203723f8 not found: ID does not exist"
Apr 22 18:11:16.427909 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.427810 2583 scope.go:117] "RemoveContainer" containerID="fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a"
Apr 22 18:11:16.428351 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:11:16.428209 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a\": container with ID starting with fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a not found: ID does not exist" containerID="fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a"
Apr 22 18:11:16.428351 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.428278 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a"} err="failed to get container status \"fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a\": rpc error: code = NotFound desc = could not find container \"fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a\": container with ID starting with fb2593b9054f4082352df3fa9fb7db86fff6e912eebbecdae02237b63d0f1b6a not found: ID does not exist"
Apr 22 18:11:16.430004 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.429982 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-dg789"]
Apr 22 18:11:16.453826 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.453807 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1c5db9-f152-445c-a595-734ebefa34b7-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 18:11:16.453906 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.453829 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d1c5db9-f152-445c-a595-734ebefa34b7-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 18:11:16.453906 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.453840 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z9gmp\" (UniqueName: \"kubernetes.io/projected/8d1c5db9-f152-445c-a595-734ebefa34b7-kube-api-access-z9gmp\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\""
Apr 22 18:11:16.665988 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:16.665911 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" path="/var/lib/kubelet/pods/8d1c5db9-f152-445c-a595-734ebefa34b7/volumes"
Apr 22 18:11:17.527705 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.527667 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"]
Apr 22 18:11:17.528090 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528037 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="storage-initializer"
Apr 22 18:11:17.528090 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528049 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="storage-initializer"
Apr 22 18:11:17.528090 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528058 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="tokenizer"
Apr 22 18:11:17.528090 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528064 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="tokenizer"
Apr 22 18:11:17.528090 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528076 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="main"
Apr 22 18:11:17.528090 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528081 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="main"
Apr 22 18:11:17.528271 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528139 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="main"
Apr 22 18:11:17.528271 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.528148 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d1c5db9-f152-445c-a595-734ebefa34b7" containerName="tokenizer"
Apr 22 18:11:17.533231 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.533207 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.535917 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.535892 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 22 18:11:17.537264 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.537244 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\""
Apr 22 18:11:17.537379 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.537288 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-fwlzn\""
Apr 22 18:11:17.537379 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.537304 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:11:17.537379 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.537369 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:11:17.542661 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.542617 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"]
Apr 22 18:11:17.563974 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.563943 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p99cr\" (UniqueName: \"kubernetes.io/projected/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kube-api-access-p99cr\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.564092 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.563980 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.564092 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.564016 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.564092 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.564042 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.564092 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.564087 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.564222 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.564105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665467 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665433 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665467 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665472 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665745 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665522 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p99cr\" (UniqueName: \"kubernetes.io/projected/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kube-api-access-p99cr\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665745 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665543 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665745 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665562 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665745 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665580 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.665981 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665960 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.666038 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.665978 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.666038 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.666025 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.666106 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.666043 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.667981 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.667960 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.681435 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.681407 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p99cr\" (UniqueName: \"kubernetes.io/projected/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kube-api-access-p99cr\") pod \"stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.845882 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.845788 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:17.974060 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.974030 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"]
Apr 22 18:11:17.976011 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:11:17.975986 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5747428_2ed2_454f_9925_e4e1ccd12a3f.slice/crio-9fcc54f007538dfc8f3d3840f5bbdf67a339b4f2cdff4c843fe7291c4064d29d WatchSource:0}: Error finding container 9fcc54f007538dfc8f3d3840f5bbdf67a339b4f2cdff4c843fe7291c4064d29d: Status 404 returned error can't find the container with id 9fcc54f007538dfc8f3d3840f5bbdf67a339b4f2cdff4c843fe7291c4064d29d
Apr 22 18:11:17.977901 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:17.977883 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:11:18.414502 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:18.414468 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerStarted","Data":"aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78"}
Apr 22 18:11:18.414502 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:18.414507 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerStarted","Data":"9fcc54f007538dfc8f3d3840f5bbdf67a339b4f2cdff4c843fe7291c4064d29d"}
Apr 22 18:11:19.418512 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:19.418476 2583 generic.go:358] "Generic (PLEG): container finished" podID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerID="aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78" exitCode=0
Apr 22 18:11:19.418881 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:19.418528 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerDied","Data":"aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78"}
Apr 22 18:11:20.424276 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:20.424242 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerStarted","Data":"d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570"}
Apr 22 18:11:20.424276 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:20.424281 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerStarted","Data":"e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb"}
Apr 22 18:11:20.424688 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:20.424369 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"
Apr 22 18:11:20.445424 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:20.445360 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" podStartSLOduration=3.445342676 podStartE2EDuration="3.445342676s" podCreationTimestamp="2026-04-22 18:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:11:20.444199905 +0000 UTC m=+1062.321469127" watchObservedRunningTime="2026-04-22 18:11:20.445342676 +0000 UTC m=+1062.322611844" Apr 22 18:11:27.846388 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:27.846294 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:11:27.846388 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:27.846343 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:11:27.849046 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:27.849021 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:11:28.456367 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:28.456339 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:11:49.460743 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:11:49.460715 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:13:24.713888 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.713852 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-f7554779f-5xljv"] Apr 22 18:13:24.716531 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.716508 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:24.720195 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.720176 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-gj4th\"" Apr 22 18:13:24.720310 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.720180 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:13:24.727753 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.727730 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-f7554779f-5xljv"] Apr 22 18:13:24.784140 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.784102 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gfrz\" (UniqueName: \"kubernetes.io/projected/b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f-kube-api-access-8gfrz\") pod \"llmisvc-controller-manager-f7554779f-5xljv\" (UID: \"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f\") " pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:24.784292 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.784222 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f-cert\") pod \"llmisvc-controller-manager-f7554779f-5xljv\" (UID: \"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f\") " pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:24.885618 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.885580 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gfrz\" (UniqueName: \"kubernetes.io/projected/b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f-kube-api-access-8gfrz\") pod \"llmisvc-controller-manager-f7554779f-5xljv\" (UID: \"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f\") " 
pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:24.885819 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.885692 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f-cert\") pod \"llmisvc-controller-manager-f7554779f-5xljv\" (UID: \"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f\") " pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:24.888056 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.888035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f-cert\") pod \"llmisvc-controller-manager-f7554779f-5xljv\" (UID: \"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f\") " pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:24.893422 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:24.893399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gfrz\" (UniqueName: \"kubernetes.io/projected/b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f-kube-api-access-8gfrz\") pod \"llmisvc-controller-manager-f7554779f-5xljv\" (UID: \"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f\") " pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:25.027219 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:25.027126 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:25.155180 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:25.155155 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-f7554779f-5xljv"] Apr 22 18:13:25.157123 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:13:25.157091 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb9ba060a_9eb7_4c6e_ae1b_fb935c3a9d6f.slice/crio-96b61d2d9c3498bc0f4a1b53362d5eb8193d2e3aa6eab3428722330c27cb4f9a WatchSource:0}: Error finding container 96b61d2d9c3498bc0f4a1b53362d5eb8193d2e3aa6eab3428722330c27cb4f9a: Status 404 returned error can't find the container with id 96b61d2d9c3498bc0f4a1b53362d5eb8193d2e3aa6eab3428722330c27cb4f9a Apr 22 18:13:25.851951 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:25.851912 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" event={"ID":"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f","Type":"ContainerStarted","Data":"96b61d2d9c3498bc0f4a1b53362d5eb8193d2e3aa6eab3428722330c27cb4f9a"} Apr 22 18:13:29.867296 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:29.867257 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" event={"ID":"b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f","Type":"ContainerStarted","Data":"b50b0eb022eb6884234a3c5ebc6b073af28ed2696ecc0219bc296fb528323e28"} Apr 22 18:13:29.867685 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:29.867345 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:13:29.883960 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:29.883905 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" podStartSLOduration=1.858592021 podStartE2EDuration="5.88388919s" 
podCreationTimestamp="2026-04-22 18:13:24 +0000 UTC" firstStartedPulling="2026-04-22 18:13:25.158474868 +0000 UTC m=+1187.035744014" lastFinishedPulling="2026-04-22 18:13:29.183772034 +0000 UTC m=+1191.061041183" observedRunningTime="2026-04-22 18:13:29.882238262 +0000 UTC m=+1191.759507431" watchObservedRunningTime="2026-04-22 18:13:29.88388919 +0000 UTC m=+1191.761158394" Apr 22 18:13:30.806994 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:30.806961 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"] Apr 22 18:13:30.807312 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:30.807262 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="main" containerID="cri-o://e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb" gracePeriod=30 Apr 22 18:13:30.807390 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:30.807294 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="tokenizer" containerID="cri-o://d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570" gracePeriod=30 Apr 22 18:13:31.877336 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:31.877304 2583 generic.go:358] "Generic (PLEG): container finished" podID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerID="e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb" exitCode=0 Apr 22 18:13:31.877336 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:31.877344 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" 
event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerDied","Data":"e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb"} Apr 22 18:13:32.053949 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.053927 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:13:32.152650 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.152547 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tls-certs\") pod \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " Apr 22 18:13:32.152820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.152680 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-tmp\") pod \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " Apr 22 18:13:32.152820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.152710 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kserve-provision-location\") pod \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " Apr 22 18:13:32.152820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.152754 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p99cr\" (UniqueName: \"kubernetes.io/projected/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kube-api-access-p99cr\") pod \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " Apr 22 18:13:32.152820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.152782 2583 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-uds\") pod \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " Apr 22 18:13:32.152820 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.152808 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-cache\") pod \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\" (UID: \"a5747428-2ed2-454f-9925-e4e1ccd12a3f\") " Apr 22 18:13:32.153114 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.153091 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a5747428-2ed2-454f-9925-e4e1ccd12a3f" (UID: "a5747428-2ed2-454f-9925-e4e1ccd12a3f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:32.153174 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.153106 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a5747428-2ed2-454f-9925-e4e1ccd12a3f" (UID: "a5747428-2ed2-454f-9925-e4e1ccd12a3f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:32.153174 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.153117 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a5747428-2ed2-454f-9925-e4e1ccd12a3f" (UID: "a5747428-2ed2-454f-9925-e4e1ccd12a3f"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:32.153477 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.153452 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5747428-2ed2-454f-9925-e4e1ccd12a3f" (UID: "a5747428-2ed2-454f-9925-e4e1ccd12a3f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:32.154868 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.154846 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a5747428-2ed2-454f-9925-e4e1ccd12a3f" (UID: "a5747428-2ed2-454f-9925-e4e1ccd12a3f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:13:32.154868 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.154859 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kube-api-access-p99cr" (OuterVolumeSpecName: "kube-api-access-p99cr") pod "a5747428-2ed2-454f-9925-e4e1ccd12a3f" (UID: "a5747428-2ed2-454f-9925-e4e1ccd12a3f"). InnerVolumeSpecName "kube-api-access-p99cr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:13:32.254214 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.254173 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-tmp\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:13:32.254214 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.254206 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:13:32.254214 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.254216 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p99cr\" (UniqueName: \"kubernetes.io/projected/a5747428-2ed2-454f-9925-e4e1ccd12a3f-kube-api-access-p99cr\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:13:32.254214 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.254226 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-uds\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:13:32.254472 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.254235 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tokenizer-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:13:32.254472 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.254244 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a5747428-2ed2-454f-9925-e4e1ccd12a3f-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:13:32.882942 ip-10-0-131-69 kubenswrapper[2583]: 
I0422 18:13:32.882906 2583 generic.go:358] "Generic (PLEG): container finished" podID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerID="d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570" exitCode=0 Apr 22 18:13:32.883349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.882950 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerDied","Data":"d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570"} Apr 22 18:13:32.883349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.882977 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" event={"ID":"a5747428-2ed2-454f-9925-e4e1ccd12a3f","Type":"ContainerDied","Data":"9fcc54f007538dfc8f3d3840f5bbdf67a339b4f2cdff4c843fe7291c4064d29d"} Apr 22 18:13:32.883349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.882993 2583 scope.go:117] "RemoveContainer" containerID="d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570" Apr 22 18:13:32.883349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.883006 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb" Apr 22 18:13:32.891557 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.891541 2583 scope.go:117] "RemoveContainer" containerID="e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb" Apr 22 18:13:32.898581 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.898561 2583 scope.go:117] "RemoveContainer" containerID="aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78" Apr 22 18:13:32.902213 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.902187 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"] Apr 22 18:13:32.906495 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.906474 2583 scope.go:117] "RemoveContainer" containerID="d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570" Apr 22 18:13:32.906934 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:13:32.906914 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570\": container with ID starting with d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570 not found: ID does not exist" containerID="d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570" Apr 22 18:13:32.907035 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.906946 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570"} err="failed to get container status \"d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570\": rpc error: code = NotFound desc = could not find container \"d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570\": container with ID starting with d137e6eee03d6fbeed09ca19e09a4231fa052e679d455e1df2134a5840891570 not found: ID 
does not exist" Apr 22 18:13:32.907035 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.906971 2583 scope.go:117] "RemoveContainer" containerID="e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb" Apr 22 18:13:32.907255 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:13:32.907236 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb\": container with ID starting with e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb not found: ID does not exist" containerID="e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb" Apr 22 18:13:32.907328 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.907264 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb"} err="failed to get container status \"e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb\": rpc error: code = NotFound desc = could not find container \"e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb\": container with ID starting with e1c403468a8ea26c93532e8052b078b7f626b7794f4f3b9c47386fd3ea6feebb not found: ID does not exist" Apr 22 18:13:32.907328 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.907283 2583 scope.go:117] "RemoveContainer" containerID="aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78" Apr 22 18:13:32.907562 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:13:32.907543 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78\": container with ID starting with aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78 not found: ID does not exist" containerID="aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78" Apr 22 
18:13:32.907611 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.907567 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78"} err="failed to get container status \"aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78\": rpc error: code = NotFound desc = could not find container \"aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78\": container with ID starting with aa6c811855e84a40bdb157524be73680a37d939a4842d7503e8ac90cb4ff5d78 not found: ID does not exist" Apr 22 18:13:32.907706 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:32.907689 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-75f56699f6-f4cnb"] Apr 22 18:13:34.665537 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:34.665499 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" path="/var/lib/kubelet/pods/a5747428-2ed2-454f-9925-e4e1ccd12a3f/volumes" Apr 22 18:13:38.637597 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:38.637566 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:13:38.638040 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:13:38.637566 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:14:00.874029 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:14:00.873993 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-f7554779f-5xljv" Apr 22 18:16:45.536620 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.536581 2583 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.536968 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="main" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.536981 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="main" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.536994 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="tokenizer" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.537000 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="tokenizer" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.537008 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="storage-initializer" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.537015 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="storage-initializer" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.537085 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="tokenizer" Apr 22 18:16:45.537126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.537092 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5747428-2ed2-454f-9925-e4e1ccd12a3f" containerName="main" Apr 22 18:16:45.539395 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.539379 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.546406 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.546372 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:16:45.547717 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.547688 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:16:45.547717 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.547711 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:16:45.547874 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.547734 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-v2cmz\"" Apr 22 18:16:45.547874 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.547708 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 18:16:45.559007 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.558985 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:16:45.683677 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.683641 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.683843 ip-10-0-131-69 kubenswrapper[2583]: I0422 
18:16:45.683713 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad422670-1bc4-41e7-9c1c-221d8f956cf1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.683843 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.683742 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn4qc\" (UniqueName: \"kubernetes.io/projected/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kube-api-access-zn4qc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.683843 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.683782 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.683952 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.683858 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.683952 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.683893 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.721755 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.721719 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd"] Apr 22 18:16:45.724541 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.724519 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.735695 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.735672 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-wcdwb\"" Apr 22 18:16:45.784381 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784347 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn4qc\" (UniqueName: \"kubernetes.io/projected/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kube-api-access-zn4qc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784551 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784400 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784551 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784442 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784551 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784472 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784551 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784536 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784778 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784649 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad422670-1bc4-41e7-9c1c-221d8f956cf1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784860 ip-10-0-131-69 kubenswrapper[2583]: I0422 
18:16:45.784837 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784860 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784855 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.784975 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.784957 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.786806 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.786752 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.786973 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.786956 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ad422670-1bc4-41e7-9c1c-221d8f956cf1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.815280 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.815241 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn4qc\" (UniqueName: \"kubernetes.io/projected/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kube-api-access-zn4qc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.849854 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.849823 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:16:45.886028 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.885995 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.886196 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.886036 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.886196 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.886085 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.886196 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.886111 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnhvs\" (UniqueName: \"kubernetes.io/projected/9408a411-2453-4624-9714-942b060a03eb-kube-api-access-cnhvs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.886315 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.886258 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9408a411-2453-4624-9714-942b060a03eb-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.886315 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.886303 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: 
\"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.906925 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.906896 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd"] Apr 22 18:16:45.987616 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.987575 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9408a411-2453-4624-9714-942b060a03eb-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.987807 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.987646 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.987807 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.987684 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.987807 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.987704 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.987807 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.987737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.988000 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.987867 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhvs\" (UniqueName: \"kubernetes.io/projected/9408a411-2453-4624-9714-942b060a03eb-kube-api-access-cnhvs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.988071 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.988051 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.988123 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.988100 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.988172 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.988147 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.988223 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.988179 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:45.990193 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:45.990175 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9408a411-2453-4624-9714-942b060a03eb-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:46.010756 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.010728 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:16:46.013651 ip-10-0-131-69 kubenswrapper[2583]: 
I0422 18:16:46.013616 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:16:46.014393 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.014368 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnhvs\" (UniqueName: \"kubernetes.io/projected/9408a411-2453-4624-9714-942b060a03eb-kube-api-access-cnhvs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:46.033795 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.033774 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:46.194906 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.194860 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd"] Apr 22 18:16:46.201986 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:16:46.201940 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9408a411_2453_4624_9714_942b060a03eb.slice/crio-4210f5a4859db8d13f4f7baf2971b9aeeafdce96d36ab16f7ee913475f3341d9 WatchSource:0}: Error finding container 4210f5a4859db8d13f4f7baf2971b9aeeafdce96d36ab16f7ee913475f3341d9: Status 404 returned error can't find the container with id 4210f5a4859db8d13f4f7baf2971b9aeeafdce96d36ab16f7ee913475f3341d9 Apr 22 18:16:46.548174 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.548134 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"ad422670-1bc4-41e7-9c1c-221d8f956cf1","Type":"ContainerStarted","Data":"2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18"} Apr 22 18:16:46.548620 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.548182 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"ad422670-1bc4-41e7-9c1c-221d8f956cf1","Type":"ContainerStarted","Data":"21ccb98e906f2dfb14c39578a6dc0453852af74d055d94a7e1e97a50e43ffa83"} Apr 22 18:16:46.549558 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.549534 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerStarted","Data":"c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e"} Apr 22 18:16:46.549669 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:46.549562 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerStarted","Data":"4210f5a4859db8d13f4f7baf2971b9aeeafdce96d36ab16f7ee913475f3341d9"} Apr 22 18:16:47.555014 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:47.554933 2583 generic.go:358] "Generic (PLEG): container finished" podID="9408a411-2453-4624-9714-942b060a03eb" containerID="c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e" exitCode=0 Apr 22 18:16:47.555411 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:47.555027 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerDied","Data":"c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e"} Apr 22 18:16:48.561952 ip-10-0-131-69 kubenswrapper[2583]: I0422 
18:16:48.561917 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerStarted","Data":"041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461"} Apr 22 18:16:48.561952 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:48.561957 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerStarted","Data":"933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b"} Apr 22 18:16:48.562393 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:48.562072 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:48.602929 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:48.602847 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" podStartSLOduration=3.60282904 podStartE2EDuration="3.60282904s" podCreationTimestamp="2026-04-22 18:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:16:48.601219074 +0000 UTC m=+1390.478488242" watchObservedRunningTime="2026-04-22 18:16:48.60282904 +0000 UTC m=+1390.480098209" Apr 22 18:16:50.572517 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:50.572478 2583 generic.go:358] "Generic (PLEG): container finished" podID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerID="2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18" exitCode=0 Apr 22 18:16:50.572950 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:50.572555 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"ad422670-1bc4-41e7-9c1c-221d8f956cf1","Type":"ContainerDied","Data":"2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18"} Apr 22 18:16:56.034948 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:56.034906 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:56.035402 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:56.034972 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:16:56.036953 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:16:56.036468 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.38:8082/healthz\": dial tcp 10.134.0.38:8082: connect: connection refused" Apr 22 18:17:06.036477 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:17:06.036408 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:17:06.037887 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:17:06.037851 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:17:19.700509 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:17:19.700468 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"ad422670-1bc4-41e7-9c1c-221d8f956cf1","Type":"ContainerStarted","Data":"c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac"} Apr 22 18:17:19.724217 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:17:19.724161 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.736034164 podStartE2EDuration="34.724145224s" podCreationTimestamp="2026-04-22 18:16:45 +0000 UTC" firstStartedPulling="2026-04-22 18:16:50.573937728 +0000 UTC m=+1392.451206874" lastFinishedPulling="2026-04-22 18:17:18.562048785 +0000 UTC m=+1420.439317934" observedRunningTime="2026-04-22 18:17:19.721929032 +0000 UTC m=+1421.599198200" watchObservedRunningTime="2026-04-22 18:17:19.724145224 +0000 UTC m=+1421.601414392" Apr 22 18:17:26.651686 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:17:26.651596 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:18:31.644462 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.644428 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl"] Apr 22 18:18:31.647658 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.647608 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.650495 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.650468 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-rk4nx\"" Apr 22 18:18:31.650652 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.650518 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 18:18:31.662191 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.662159 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm"] Apr 22 18:18:31.665851 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.665817 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl"] Apr 22 18:18:31.666183 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.666154 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.680052 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.679940 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm"] Apr 22 18:18:31.683142 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.683113 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8bp\" (UniqueName: \"kubernetes.io/projected/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kube-api-access-fz8bp\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.683286 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.683148 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-model-cache\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.683286 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.683175 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-dshm\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.683286 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.683254 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.683452 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.683301 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-home\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.683452 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.683407 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.784466 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8bp\" (UniqueName: \"kubernetes.io/projected/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kube-api-access-fz8bp\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784472 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-home\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784507 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784537 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-model-cache\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784565 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-dshm\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784588 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: 
\"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784618 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784680 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.784711 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784704 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzbzh\" (UniqueName: \"kubernetes.io/projected/55caa28d-fa87-4efe-9388-755f0be214cf-kube-api-access-lzbzh\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.785155 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784737 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-home\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.785155 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.784845 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.785155 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.785034 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55caa28d-fa87-4efe-9388-755f0be214cf-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.785155 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.785094 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-model-cache\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.785155 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.785124 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-home\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.785480 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.785460 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.786999 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.786964 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-dshm\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.787296 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.787280 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.792701 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.792663 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8bp\" (UniqueName: \"kubernetes.io/projected/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kube-api-access-fz8bp\") pod \"custom-route-timeout-pd-test-kserve-76c7464b76-945gl\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.886569 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.886536 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/55caa28d-fa87-4efe-9388-755f0be214cf-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.886786 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.886591 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.886786 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.886636 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.886786 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.886671 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.887104 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.887074 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-home\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.887104 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.887091 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.887258 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.887124 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.887258 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.887133 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzbzh\" (UniqueName: \"kubernetes.io/projected/55caa28d-fa87-4efe-9388-755f0be214cf-kube-api-access-lzbzh\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.887258 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.887207 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: 
\"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.889366 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.889338 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.889795 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.889772 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55caa28d-fa87-4efe-9388-755f0be214cf-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.903081 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.903014 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzbzh\" (UniqueName: \"kubernetes.io/projected/55caa28d-fa87-4efe-9388-755f0be214cf-kube-api-access-lzbzh\") pod \"custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:31.963172 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.963135 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:31.984245 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:31.984208 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:32.117724 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:32.117689 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl"] Apr 22 18:18:32.120030 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:18:32.119990 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod962b327c_6e8d_45ec_8d36_6fcd15f6ea70.slice/crio-03a83463bad0f31ecd30ad2dbde195da92223763cdf1810fd47b900e0a4e41e6 WatchSource:0}: Error finding container 03a83463bad0f31ecd30ad2dbde195da92223763cdf1810fd47b900e0a4e41e6: Status 404 returned error can't find the container with id 03a83463bad0f31ecd30ad2dbde195da92223763cdf1810fd47b900e0a4e41e6 Apr 22 18:18:32.143963 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:32.143817 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm"] Apr 22 18:18:32.146808 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:18:32.146779 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55caa28d_fa87_4efe_9388_755f0be214cf.slice/crio-109d27ec090fbfb3f83365eddf98bd300a2359bdeea279044c00fe36fc7c6f5f WatchSource:0}: Error finding container 109d27ec090fbfb3f83365eddf98bd300a2359bdeea279044c00fe36fc7c6f5f: Status 404 returned error can't find the container with id 109d27ec090fbfb3f83365eddf98bd300a2359bdeea279044c00fe36fc7c6f5f Apr 22 18:18:32.964489 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:32.964449 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" 
event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerStarted","Data":"03a83463bad0f31ecd30ad2dbde195da92223763cdf1810fd47b900e0a4e41e6"} Apr 22 18:18:32.966151 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:32.966119 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" event={"ID":"55caa28d-fa87-4efe-9388-755f0be214cf","Type":"ContainerStarted","Data":"0910636f0f69b061fa1cafdbde9da7f0d5e5c77884d25ff9665b6301c3c70f5e"} Apr 22 18:18:32.966307 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:32.966156 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" event={"ID":"55caa28d-fa87-4efe-9388-755f0be214cf","Type":"ContainerStarted","Data":"109d27ec090fbfb3f83365eddf98bd300a2359bdeea279044c00fe36fc7c6f5f"} Apr 22 18:18:33.971878 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:33.971833 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerStarted","Data":"3ae625e03c8218dceee4060887f76071c2f3778a5866e0bb620d6067af5aacdc"} Apr 22 18:18:34.979087 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:34.979049 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerStarted","Data":"00dce15103061c5ac069e62e0088281d34cbd4a7c88141afa38533f233ab762c"} Apr 22 18:18:34.979814 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:34.979332 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:36.990784 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:36.990749 2583 generic.go:358] "Generic (PLEG): container 
finished" podID="55caa28d-fa87-4efe-9388-755f0be214cf" containerID="0910636f0f69b061fa1cafdbde9da7f0d5e5c77884d25ff9665b6301c3c70f5e" exitCode=0 Apr 22 18:18:36.991156 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:36.990828 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" event={"ID":"55caa28d-fa87-4efe-9388-755f0be214cf","Type":"ContainerDied","Data":"0910636f0f69b061fa1cafdbde9da7f0d5e5c77884d25ff9665b6301c3c70f5e"} Apr 22 18:18:38.000933 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:38.000892 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" event={"ID":"55caa28d-fa87-4efe-9388-755f0be214cf","Type":"ContainerStarted","Data":"f0c9c7d9d18991e15624171f6a4edfedeac4fcc502eb6d1302904474d342f466"} Apr 22 18:18:38.029039 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:38.028973 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podStartSLOduration=7.028952422 podStartE2EDuration="7.028952422s" podCreationTimestamp="2026-04-22 18:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:18:38.025357513 +0000 UTC m=+1499.902626682" watchObservedRunningTime="2026-04-22 18:18:38.028952422 +0000 UTC m=+1499.906221617" Apr 22 18:18:38.673171 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:38.673139 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:18:38.674275 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:38.674251 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:18:39.008251 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:39.008208 2583 generic.go:358] "Generic (PLEG): container finished" podID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerID="00dce15103061c5ac069e62e0088281d34cbd4a7c88141afa38533f233ab762c" exitCode=0 Apr 22 18:18:39.008727 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:39.008284 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerDied","Data":"00dce15103061c5ac069e62e0088281d34cbd4a7c88141afa38533f233ab762c"} Apr 22 18:18:40.015065 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:40.015020 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerStarted","Data":"80cc6714af9a1de303b741bc5ea36759bceb0dbf14ca40408c643b07b2977631"} Apr 22 18:18:40.042585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:40.042510 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podStartSLOduration=8.063862724 podStartE2EDuration="9.042489533s" podCreationTimestamp="2026-04-22 18:18:31 +0000 UTC" firstStartedPulling="2026-04-22 18:18:32.122694558 +0000 UTC m=+1493.999963708" lastFinishedPulling="2026-04-22 18:18:33.101321354 +0000 UTC m=+1494.978590517" observedRunningTime="2026-04-22 18:18:40.039209568 +0000 UTC m=+1501.916478736" watchObservedRunningTime="2026-04-22 18:18:40.042489533 +0000 UTC m=+1501.919758703" Apr 22 18:18:41.964206 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:41.964165 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:41.964690 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:41.964219 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:41.965937 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:41.965898 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:18:41.985038 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:41.985010 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:41.985230 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:41.985194 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:18:41.986823 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:41.986780 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:18:51.964296 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:51.964193 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 
10.134.0.39:8001: connect: connection refused" Apr 22 18:18:51.991151 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:51.977537 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:18:51.991151 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:51.985072 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:18:53.110760 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:53.110721 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd"] Apr 22 18:18:53.111730 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:53.111659 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="main" containerID="cri-o://933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b" gracePeriod=30 Apr 22 18:18:53.112120 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:53.112087 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="tokenizer" containerID="cri-o://041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461" gracePeriod=30 Apr 22 18:18:54.087773 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.087724 2583 generic.go:358] "Generic (PLEG): container finished" podID="9408a411-2453-4624-9714-942b060a03eb" 
containerID="933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b" exitCode=0 Apr 22 18:18:54.087773 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.087753 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerDied","Data":"933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b"} Apr 22 18:18:54.518313 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.518202 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:18:54.520243 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.520217 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:18:54.520487 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.520465 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerName="main" containerID="cri-o://c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac" gracePeriod=30 Apr 22 18:18:54.634541 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634454 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnhvs\" (UniqueName: \"kubernetes.io/projected/9408a411-2453-4624-9714-942b060a03eb-kube-api-access-cnhvs\") pod \"9408a411-2453-4624-9714-942b060a03eb\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " Apr 22 18:18:54.634738 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634570 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-uds\") 
pod \"9408a411-2453-4624-9714-942b060a03eb\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " Apr 22 18:18:54.634738 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634594 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-cache\") pod \"9408a411-2453-4624-9714-942b060a03eb\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " Apr 22 18:18:54.634738 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634688 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-tmp\") pod \"9408a411-2453-4624-9714-942b060a03eb\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " Apr 22 18:18:54.634908 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634738 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9408a411-2453-4624-9714-942b060a03eb-tls-certs\") pod \"9408a411-2453-4624-9714-942b060a03eb\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " Apr 22 18:18:54.634908 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634766 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-kserve-provision-location\") pod \"9408a411-2453-4624-9714-942b060a03eb\" (UID: \"9408a411-2453-4624-9714-942b060a03eb\") " Apr 22 18:18:54.635003 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.634893 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9408a411-2453-4624-9714-942b060a03eb" (UID: "9408a411-2453-4624-9714-942b060a03eb"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.635136 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.635102 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-uds\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.635216 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.635155 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9408a411-2453-4624-9714-942b060a03eb" (UID: "9408a411-2453-4624-9714-942b060a03eb"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.635356 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.635332 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9408a411-2453-4624-9714-942b060a03eb" (UID: "9408a411-2453-4624-9714-942b060a03eb"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.635706 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.635681 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9408a411-2453-4624-9714-942b060a03eb" (UID: "9408a411-2453-4624-9714-942b060a03eb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.637649 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.637601 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9408a411-2453-4624-9714-942b060a03eb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9408a411-2453-4624-9714-942b060a03eb" (UID: "9408a411-2453-4624-9714-942b060a03eb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:18:54.637762 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.637743 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9408a411-2453-4624-9714-942b060a03eb-kube-api-access-cnhvs" (OuterVolumeSpecName: "kube-api-access-cnhvs") pod "9408a411-2453-4624-9714-942b060a03eb" (UID: "9408a411-2453-4624-9714-942b060a03eb"). InnerVolumeSpecName "kube-api-access-cnhvs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:18:54.736349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.736311 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.736349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.736343 2583 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-tokenizer-tmp\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.736349 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.736356 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9408a411-2453-4624-9714-942b060a03eb-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.736585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.736369 2583 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9408a411-2453-4624-9714-942b060a03eb-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.736585 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:54.736384 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnhvs\" (UniqueName: \"kubernetes.io/projected/9408a411-2453-4624-9714-942b060a03eb-kube-api-access-cnhvs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:55.094723 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.094684 2583 generic.go:358] "Generic (PLEG): container finished" podID="9408a411-2453-4624-9714-942b060a03eb" containerID="041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461" exitCode=0 Apr 22 18:18:55.094920 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.094769 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" Apr 22 18:18:55.094920 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.094767 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerDied","Data":"041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461"} Apr 22 18:18:55.094920 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.094843 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd" event={"ID":"9408a411-2453-4624-9714-942b060a03eb","Type":"ContainerDied","Data":"4210f5a4859db8d13f4f7baf2971b9aeeafdce96d36ab16f7ee913475f3341d9"} Apr 22 18:18:55.094920 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.094870 2583 scope.go:117] "RemoveContainer" 
containerID="041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461" Apr 22 18:18:55.105280 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.105263 2583 scope.go:117] "RemoveContainer" containerID="933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b" Apr 22 18:18:55.114873 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.114852 2583 scope.go:117] "RemoveContainer" containerID="c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e" Apr 22 18:18:55.117444 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.117412 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd"] Apr 22 18:18:55.122389 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.122366 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-sche4sctd"] Apr 22 18:18:55.124615 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.124594 2583 scope.go:117] "RemoveContainer" containerID="041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461" Apr 22 18:18:55.125007 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:18:55.124983 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461\": container with ID starting with 041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461 not found: ID does not exist" containerID="041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461" Apr 22 18:18:55.125078 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.125020 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461"} err="failed to get container status \"041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461\": rpc error: code = NotFound desc = could 
not find container \"041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461\": container with ID starting with 041c74210a333fa58dbc2952f2cf31813d78fe409c8649cf7cc3cbfdf145c461 not found: ID does not exist" Apr 22 18:18:55.125078 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.125046 2583 scope.go:117] "RemoveContainer" containerID="933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b" Apr 22 18:18:55.125334 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:18:55.125309 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b\": container with ID starting with 933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b not found: ID does not exist" containerID="933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b" Apr 22 18:18:55.125455 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.125341 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b"} err="failed to get container status \"933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b\": rpc error: code = NotFound desc = could not find container \"933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b\": container with ID starting with 933e1b57ce2250d9385b2c293ab6bfeb45a9e72ff37de2e2a468ef4b90fbad4b not found: ID does not exist" Apr 22 18:18:55.125455 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.125360 2583 scope.go:117] "RemoveContainer" containerID="c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e" Apr 22 18:18:55.125657 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:18:55.125613 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e\": container with ID 
starting with c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e not found: ID does not exist" containerID="c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e" Apr 22 18:18:55.125737 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.125662 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e"} err="failed to get container status \"c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e\": rpc error: code = NotFound desc = could not find container \"c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e\": container with ID starting with c4da743567e0c2e62e0354a31925682df9d458562bd38425d0d673dba90d332e not found: ID does not exist" Apr 22 18:18:55.450800 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.450768 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:18:55.546471 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546439 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn4qc\" (UniqueName: \"kubernetes.io/projected/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kube-api-access-zn4qc\") pod \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " Apr 22 18:18:55.546964 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546540 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-dshm\") pod \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " Apr 22 18:18:55.546964 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546571 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-home\") pod \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " Apr 22 18:18:55.546964 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546596 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-model-cache\") pod \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " Apr 22 18:18:55.546964 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546635 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad422670-1bc4-41e7-9c1c-221d8f956cf1-tls-certs\") pod \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " Apr 22 18:18:55.546964 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546660 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kserve-provision-location\") pod \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\" (UID: \"ad422670-1bc4-41e7-9c1c-221d8f956cf1\") " Apr 22 18:18:55.547302 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.546978 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-model-cache" (OuterVolumeSpecName: "model-cache") pod "ad422670-1bc4-41e7-9c1c-221d8f956cf1" (UID: "ad422670-1bc4-41e7-9c1c-221d8f956cf1"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:55.547302 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.547060 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-home" (OuterVolumeSpecName: "home") pod "ad422670-1bc4-41e7-9c1c-221d8f956cf1" (UID: "ad422670-1bc4-41e7-9c1c-221d8f956cf1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:55.549002 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.548958 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad422670-1bc4-41e7-9c1c-221d8f956cf1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ad422670-1bc4-41e7-9c1c-221d8f956cf1" (UID: "ad422670-1bc4-41e7-9c1c-221d8f956cf1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:18:55.549863 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.549828 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-dshm" (OuterVolumeSpecName: "dshm") pod "ad422670-1bc4-41e7-9c1c-221d8f956cf1" (UID: "ad422670-1bc4-41e7-9c1c-221d8f956cf1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:55.549985 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.549951 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kube-api-access-zn4qc" (OuterVolumeSpecName: "kube-api-access-zn4qc") pod "ad422670-1bc4-41e7-9c1c-221d8f956cf1" (UID: "ad422670-1bc4-41e7-9c1c-221d8f956cf1"). InnerVolumeSpecName "kube-api-access-zn4qc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:18:55.605028 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.604974 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad422670-1bc4-41e7-9c1c-221d8f956cf1" (UID: "ad422670-1bc4-41e7-9c1c-221d8f956cf1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:55.647597 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.647554 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zn4qc\" (UniqueName: \"kubernetes.io/projected/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kube-api-access-zn4qc\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:55.647597 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.647590 2583 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-dshm\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:55.647597 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.647603 2583 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-home\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:55.647859 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.647615 2583 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-model-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:55.647859 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.647642 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad422670-1bc4-41e7-9c1c-221d8f956cf1-tls-certs\") on 
node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:55.647859 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:55.647654 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad422670-1bc4-41e7-9c1c-221d8f956cf1-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:18:56.101154 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.101112 2583 generic.go:358] "Generic (PLEG): container finished" podID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerID="c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac" exitCode=0 Apr 22 18:18:56.101364 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.101185 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 18:18:56.101364 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.101191 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"ad422670-1bc4-41e7-9c1c-221d8f956cf1","Type":"ContainerDied","Data":"c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac"} Apr 22 18:18:56.101364 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.101228 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"ad422670-1bc4-41e7-9c1c-221d8f956cf1","Type":"ContainerDied","Data":"21ccb98e906f2dfb14c39578a6dc0453852af74d055d94a7e1e97a50e43ffa83"} Apr 22 18:18:56.101364 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.101243 2583 scope.go:117] "RemoveContainer" containerID="c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac" Apr 22 18:18:56.120183 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.120160 2583 scope.go:117] "RemoveContainer" 
containerID="2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18" Apr 22 18:18:56.126380 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.126352 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:18:56.131140 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.131113 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 18:18:56.184001 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.183975 2583 scope.go:117] "RemoveContainer" containerID="c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac" Apr 22 18:18:56.184354 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:18:56.184324 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac\": container with ID starting with c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac not found: ID does not exist" containerID="c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac" Apr 22 18:18:56.184482 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.184362 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac"} err="failed to get container status \"c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac\": rpc error: code = NotFound desc = could not find container \"c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac\": container with ID starting with c7ccf69d9811e1e47714ffab30f689598c6e25a7ebdeea5e75b8182b2f5a19ac not found: ID does not exist" Apr 22 18:18:56.184482 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.184384 2583 scope.go:117] "RemoveContainer" 
containerID="2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18" Apr 22 18:18:56.184752 ip-10-0-131-69 kubenswrapper[2583]: E0422 18:18:56.184721 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18\": container with ID starting with 2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18 not found: ID does not exist" containerID="2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18" Apr 22 18:18:56.184845 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.184766 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18"} err="failed to get container status \"2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18\": rpc error: code = NotFound desc = could not find container \"2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18\": container with ID starting with 2d6a609fd850923801cdb3734a4fe40d13eccc735b4e902c2bdc547ac9055a18 not found: ID does not exist" Apr 22 18:18:56.666512 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.666476 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9408a411-2453-4624-9714-942b060a03eb" path="/var/lib/kubelet/pods/9408a411-2453-4624-9714-942b060a03eb/volumes" Apr 22 18:18:56.666983 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:18:56.666951 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" path="/var/lib/kubelet/pods/ad422670-1bc4-41e7-9c1c-221d8f956cf1/volumes" Apr 22 18:19:01.964363 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:01.964317 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:19:01.985564 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:01.985521 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:19:11.964044 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:11.963983 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:19:11.984965 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:11.984921 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:19:21.964054 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:21.964004 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:19:21.985074 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:21.985036 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" 
podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:19:31.964210 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:31.964161 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:19:31.984995 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:31.984958 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:19:41.964583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:41.964536 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:19:41.985241 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:41.985202 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:19:51.963864 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:51.963812 2583 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:19:51.985024 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:19:51.984981 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:20:01.964429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:01.964386 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:20:01.985417 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:01.985378 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:20:11.964075 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:11.964034 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:20:11.985563 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:11.985515 
2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:20:21.963822 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:21.963728 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:20:21.985022 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:21.984984 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:20:31.963581 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:31.963543 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:20:31.984612 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:31.984576 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 
18:20:41.963752 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:41.963705 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:20:41.984916 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:41.984877 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 22 18:20:51.963849 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:51.963802 2583 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8001/health\": dial tcp 10.134.0.39:8001: connect: connection refused" Apr 22 18:20:51.994500 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:51.994467 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:20:52.002137 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:20:52.002109 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:21:01.974496 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:01.974461 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:21:01.986537 ip-10-0-131-69 kubenswrapper[2583]: 
I0422 18:21:01.986511 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:21:14.570172 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:14.570129 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm"] Apr 22 18:21:14.570725 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:14.570459 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" containerID="cri-o://f0c9c7d9d18991e15624171f6a4edfedeac4fcc502eb6d1302904474d342f466" gracePeriod=30 Apr 22 18:21:14.581275 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:14.581244 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl"] Apr 22 18:21:14.581683 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:14.581605 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" containerID="cri-o://80cc6714af9a1de303b741bc5ea36759bceb0dbf14ca40408c643b07b2977631" gracePeriod=30 Apr 22 18:21:44.581876 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.581811 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="llm-d-routing-sidecar" containerID="cri-o://3ae625e03c8218dceee4060887f76071c2f3778a5866e0bb620d6067af5aacdc" gracePeriod=2 Apr 22 18:21:44.744501 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.744480 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-76c7464b76-945gl_962b327c-6e8d-45ec-8d36-6fcd15f6ea70/main/0.log" Apr 22 18:21:44.745141 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.745115 2583 generic.go:358] "Generic (PLEG): container finished" podID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerID="80cc6714af9a1de303b741bc5ea36759bceb0dbf14ca40408c643b07b2977631" exitCode=137 Apr 22 18:21:44.745141 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.745138 2583 generic.go:358] "Generic (PLEG): container finished" podID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerID="3ae625e03c8218dceee4060887f76071c2f3778a5866e0bb620d6067af5aacdc" exitCode=0 Apr 22 18:21:44.745297 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.745208 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerDied","Data":"80cc6714af9a1de303b741bc5ea36759bceb0dbf14ca40408c643b07b2977631"} Apr 22 18:21:44.745297 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.745232 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerDied","Data":"3ae625e03c8218dceee4060887f76071c2f3778a5866e0bb620d6067af5aacdc"} Apr 22 18:21:44.747044 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.747005 2583 generic.go:358] "Generic (PLEG): container finished" podID="55caa28d-fa87-4efe-9388-755f0be214cf" containerID="f0c9c7d9d18991e15624171f6a4edfedeac4fcc502eb6d1302904474d342f466" exitCode=137 Apr 22 18:21:44.747170 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.747064 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" 
event={"ID":"55caa28d-fa87-4efe-9388-755f0be214cf","Type":"ContainerDied","Data":"f0c9c7d9d18991e15624171f6a4edfedeac4fcc502eb6d1302904474d342f466"} Apr 22 18:21:44.870012 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.869983 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:21:44.878319 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.878301 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-76c7464b76-945gl_962b327c-6e8d-45ec-8d36-6fcd15f6ea70/main/0.log" Apr 22 18:21:44.878971 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.878951 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:21:44.962480 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962445 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzbzh\" (UniqueName: \"kubernetes.io/projected/55caa28d-fa87-4efe-9388-755f0be214cf-kube-api-access-lzbzh\") pod \"55caa28d-fa87-4efe-9388-755f0be214cf\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " Apr 22 18:21:44.962480 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962484 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-tls-certs\") pod \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " Apr 22 18:21:44.962746 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962529 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-model-cache\") pod \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\" (UID: 
\"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " Apr 22 18:21:44.962746 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962568 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kserve-provision-location\") pod \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " Apr 22 18:21:44.962746 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962598 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz8bp\" (UniqueName: \"kubernetes.io/projected/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kube-api-access-fz8bp\") pod \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " Apr 22 18:21:44.962746 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962638 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-dshm\") pod \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " Apr 22 18:21:44.962746 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962662 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-home\") pod \"55caa28d-fa87-4efe-9388-755f0be214cf\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " Apr 22 18:21:44.963008 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.962842 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-model-cache" (OuterVolumeSpecName: "model-cache") pod "962b327c-6e8d-45ec-8d36-6fcd15f6ea70" (UID: "962b327c-6e8d-45ec-8d36-6fcd15f6ea70"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:44.963068 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963008 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-model-cache\") pod \"55caa28d-fa87-4efe-9388-755f0be214cf\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " Apr 22 18:21:44.963129 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963071 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55caa28d-fa87-4efe-9388-755f0be214cf-tls-certs\") pod \"55caa28d-fa87-4efe-9388-755f0be214cf\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " Apr 22 18:21:44.963129 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963010 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-home" (OuterVolumeSpecName: "home") pod "55caa28d-fa87-4efe-9388-755f0be214cf" (UID: "55caa28d-fa87-4efe-9388-755f0be214cf"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:44.963457 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963171 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-dshm\") pod \"55caa28d-fa87-4efe-9388-755f0be214cf\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " Apr 22 18:21:44.963545 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963487 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-home\") pod \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\" (UID: \"962b327c-6e8d-45ec-8d36-6fcd15f6ea70\") " Apr 22 18:21:44.963611 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963561 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-kserve-provision-location\") pod \"55caa28d-fa87-4efe-9388-755f0be214cf\" (UID: \"55caa28d-fa87-4efe-9388-755f0be214cf\") " Apr 22 18:21:44.963699 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963671 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-model-cache" (OuterVolumeSpecName: "model-cache") pod "55caa28d-fa87-4efe-9388-755f0be214cf" (UID: "55caa28d-fa87-4efe-9388-755f0be214cf"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:44.963982 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963952 2583 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-model-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:44.963982 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963978 2583 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-home\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:44.964130 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.963992 2583 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-model-cache\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:44.964130 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.964099 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-home" (OuterVolumeSpecName: "home") pod "962b327c-6e8d-45ec-8d36-6fcd15f6ea70" (UID: "962b327c-6e8d-45ec-8d36-6fcd15f6ea70"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:44.964967 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.964941 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-dshm" (OuterVolumeSpecName: "dshm") pod "962b327c-6e8d-45ec-8d36-6fcd15f6ea70" (UID: "962b327c-6e8d-45ec-8d36-6fcd15f6ea70"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:44.965088 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.964997 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55caa28d-fa87-4efe-9388-755f0be214cf-kube-api-access-lzbzh" (OuterVolumeSpecName: "kube-api-access-lzbzh") pod "55caa28d-fa87-4efe-9388-755f0be214cf" (UID: "55caa28d-fa87-4efe-9388-755f0be214cf"). InnerVolumeSpecName "kube-api-access-lzbzh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:21:44.965533 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.965508 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "962b327c-6e8d-45ec-8d36-6fcd15f6ea70" (UID: "962b327c-6e8d-45ec-8d36-6fcd15f6ea70"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:21:44.965533 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.965523 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55caa28d-fa87-4efe-9388-755f0be214cf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "55caa28d-fa87-4efe-9388-755f0be214cf" (UID: "55caa28d-fa87-4efe-9388-755f0be214cf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:21:44.966371 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.966349 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kube-api-access-fz8bp" (OuterVolumeSpecName: "kube-api-access-fz8bp") pod "962b327c-6e8d-45ec-8d36-6fcd15f6ea70" (UID: "962b327c-6e8d-45ec-8d36-6fcd15f6ea70"). InnerVolumeSpecName "kube-api-access-fz8bp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:21:44.966569 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:44.966554 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-dshm" (OuterVolumeSpecName: "dshm") pod "55caa28d-fa87-4efe-9388-755f0be214cf" (UID: "55caa28d-fa87-4efe-9388-755f0be214cf"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:45.034917 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.034882 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "962b327c-6e8d-45ec-8d36-6fcd15f6ea70" (UID: "962b327c-6e8d-45ec-8d36-6fcd15f6ea70"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:45.035298 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.035275 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "55caa28d-fa87-4efe-9388-755f0be214cf" (UID: "55caa28d-fa87-4efe-9388-755f0be214cf"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:21:45.065278 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065252 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065278 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065275 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzbzh\" (UniqueName: \"kubernetes.io/projected/55caa28d-fa87-4efe-9388-755f0be214cf-kube-api-access-lzbzh\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065287 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065299 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kserve-provision-location\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065307 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz8bp\" (UniqueName: \"kubernetes.io/projected/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-kube-api-access-fz8bp\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065315 2583 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-dshm\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 
kubenswrapper[2583]: I0422 18:21:45.065324 2583 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55caa28d-fa87-4efe-9388-755f0be214cf-tls-certs\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065332 2583 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55caa28d-fa87-4efe-9388-755f0be214cf-dshm\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.065429 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.065340 2583 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/962b327c-6e8d-45ec-8d36-6fcd15f6ea70-home\") on node \"ip-10-0-131-69.ec2.internal\" DevicePath \"\"" Apr 22 18:21:45.751823 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.751792 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-76c7464b76-945gl_962b327c-6e8d-45ec-8d36-6fcd15f6ea70/main/0.log" Apr 22 18:21:45.752523 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.752500 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" event={"ID":"962b327c-6e8d-45ec-8d36-6fcd15f6ea70","Type":"ContainerDied","Data":"03a83463bad0f31ecd30ad2dbde195da92223763cdf1810fd47b900e0a4e41e6"} Apr 22 18:21:45.752588 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.752548 2583 scope.go:117] "RemoveContainer" containerID="80cc6714af9a1de303b741bc5ea36759bceb0dbf14ca40408c643b07b2977631" Apr 22 18:21:45.752646 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.752546 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl" Apr 22 18:21:45.753986 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.753961 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" event={"ID":"55caa28d-fa87-4efe-9388-755f0be214cf","Type":"ContainerDied","Data":"109d27ec090fbfb3f83365eddf98bd300a2359bdeea279044c00fe36fc7c6f5f"} Apr 22 18:21:45.754111 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.754092 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm" Apr 22 18:21:45.773570 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.773550 2583 scope.go:117] "RemoveContainer" containerID="00dce15103061c5ac069e62e0088281d34cbd4a7c88141afa38533f233ab762c" Apr 22 18:21:45.781128 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.781105 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl"] Apr 22 18:21:45.785344 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.785179 2583 scope.go:117] "RemoveContainer" containerID="3ae625e03c8218dceee4060887f76071c2f3778a5866e0bb620d6067af5aacdc" Apr 22 18:21:45.787665 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.787642 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-76c7464b76-945gl"] Apr 22 18:21:45.795782 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.795760 2583 scope.go:117] "RemoveContainer" containerID="f0c9c7d9d18991e15624171f6a4edfedeac4fcc502eb6d1302904474d342f466" Apr 22 18:21:45.804956 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.804935 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm"] Apr 22 18:21:45.809157 
ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.809135 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cbdcff74f-j5fdm"] Apr 22 18:21:45.815767 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:45.815745 2583 scope.go:117] "RemoveContainer" containerID="0910636f0f69b061fa1cafdbde9da7f0d5e5c77884d25ff9665b6301c3c70f5e" Apr 22 18:21:46.666506 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:46.666468 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" path="/var/lib/kubelet/pods/55caa28d-fa87-4efe-9388-755f0be214cf/volumes" Apr 22 18:21:46.666929 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:21:46.666915 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" path="/var/lib/kubelet/pods/962b327c-6e8d-45ec-8d36-6fcd15f6ea70/volumes" Apr 22 18:23:38.698866 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:23:38.698832 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:23:38.701282 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:23:38.701261 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:24:44.517670 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:44.517636 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-4nqhm_c122c782-f623-45a9-b000-e3436d9bc99f/manager/0.log" Apr 22 18:24:46.983496 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983461 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kx68p/must-gather-vk4sx"] Apr 22 18:24:46.983881 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983856 2583 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerName="storage-initializer" Apr 22 18:24:46.983881 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983869 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerName="storage-initializer" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983887 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="storage-initializer" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983893 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="storage-initializer" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983899 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="tokenizer" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983905 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="tokenizer" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983911 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerName="main" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983917 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerName="main" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983922 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983927 2583 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983937 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="llm-d-routing-sidecar" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983942 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="llm-d-routing-sidecar" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983950 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="main" Apr 22 18:24:46.983954 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983954 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="main" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983960 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="storage-initializer" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983966 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="storage-initializer" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983973 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="storage-initializer" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983978 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="storage-initializer" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983984 2583 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.983988 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.984042 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad422670-1bc4-41e7-9c1c-221d8f956cf1" containerName="main" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.984051 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="llm-d-routing-sidecar" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.984058 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="tokenizer" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.984067 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9408a411-2453-4624-9714-942b060a03eb" containerName="main" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.984073 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="962b327c-6e8d-45ec-8d36-6fcd15f6ea70" containerName="main" Apr 22 18:24:46.984304 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.984080 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="55caa28d-fa87-4efe-9388-755f0be214cf" containerName="main" Apr 22 18:24:46.987058 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.987039 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:46.989933 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.989899 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kx68p\"/\"default-dockercfg-c5qsr\"" Apr 22 18:24:46.990173 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.990148 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kx68p\"/\"openshift-service-ca.crt\"" Apr 22 18:24:46.990288 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.990170 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kx68p\"/\"kube-root-ca.crt\"" Apr 22 18:24:46.995890 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:46.995865 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/must-gather-vk4sx"] Apr 22 18:24:47.057960 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.057931 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr286\" (UniqueName: \"kubernetes.io/projected/b8a304d4-0db9-4aa6-8b3c-8beb85463e2d-kube-api-access-mr286\") pod \"must-gather-vk4sx\" (UID: \"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d\") " pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.058107 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.057978 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8a304d4-0db9-4aa6-8b3c-8beb85463e2d-must-gather-output\") pod \"must-gather-vk4sx\" (UID: \"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d\") " pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.159329 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.159284 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mr286\" (UniqueName: 
\"kubernetes.io/projected/b8a304d4-0db9-4aa6-8b3c-8beb85463e2d-kube-api-access-mr286\") pod \"must-gather-vk4sx\" (UID: \"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d\") " pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.159507 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.159348 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8a304d4-0db9-4aa6-8b3c-8beb85463e2d-must-gather-output\") pod \"must-gather-vk4sx\" (UID: \"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d\") " pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.159724 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.159708 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8a304d4-0db9-4aa6-8b3c-8beb85463e2d-must-gather-output\") pod \"must-gather-vk4sx\" (UID: \"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d\") " pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.170396 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.170367 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr286\" (UniqueName: \"kubernetes.io/projected/b8a304d4-0db9-4aa6-8b3c-8beb85463e2d-kube-api-access-mr286\") pod \"must-gather-vk4sx\" (UID: \"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d\") " pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.297613 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.297515 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx68p/must-gather-vk4sx" Apr 22 18:24:47.467858 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.467820 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/must-gather-vk4sx"] Apr 22 18:24:47.469636 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:24:47.469583 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8a304d4_0db9_4aa6_8b3c_8beb85463e2d.slice/crio-70107dad9a81fd31b5a829c226bdbc52ccbe4d84bea4ed6568277d2fc65344b0 WatchSource:0}: Error finding container 70107dad9a81fd31b5a829c226bdbc52ccbe4d84bea4ed6568277d2fc65344b0: Status 404 returned error can't find the container with id 70107dad9a81fd31b5a829c226bdbc52ccbe4d84bea4ed6568277d2fc65344b0 Apr 22 18:24:47.471428 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:47.471402 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:24:48.380358 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:48.380315 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/must-gather-vk4sx" event={"ID":"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d","Type":"ContainerStarted","Data":"70107dad9a81fd31b5a829c226bdbc52ccbe4d84bea4ed6568277d2fc65344b0"} Apr 22 18:24:49.388144 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:49.388100 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/must-gather-vk4sx" event={"ID":"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d","Type":"ContainerStarted","Data":"c21b145910252789ea1fc9b7edd3c426e9f1e8507d8f0c1d374b6474df42cfd1"} Apr 22 18:24:49.388144 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:49.388145 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/must-gather-vk4sx" 
event={"ID":"b8a304d4-0db9-4aa6-8b3c-8beb85463e2d","Type":"ContainerStarted","Data":"e9a0ce752666281b72595e21a6b5d739857aee373205acf7f7aba18eb7e5f2cc"} Apr 22 18:24:49.868525 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:49.868490 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rzw2s_8b464d48-2213-403b-824b-b5e2c32387b5/global-pull-secret-syncer/0.log" Apr 22 18:24:49.979984 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:49.979957 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xj964_661612b0-cef9-4d3b-ad68-e9507dc62d38/konnectivity-agent/0.log" Apr 22 18:24:50.020960 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:50.020933 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-69.ec2.internal_5240e71df4ae79d7d950c1a36bd685b5/haproxy/0.log" Apr 22 18:24:54.030704 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:54.030669 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-4nqhm_c122c782-f623-45a9-b000-e3436d9bc99f/manager/0.log" Apr 22 18:24:55.177328 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.177297 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/alertmanager/0.log" Apr 22 18:24:55.203215 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.203063 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/config-reloader/0.log" Apr 22 18:24:55.227034 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.226883 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/kube-rbac-proxy-web/0.log" Apr 22 18:24:55.244912 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.244886 2583 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/kube-rbac-proxy/0.log" Apr 22 18:24:55.265792 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.265751 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/kube-rbac-proxy-metric/0.log" Apr 22 18:24:55.281908 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.281881 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/prom-label-proxy/0.log" Apr 22 18:24:55.298744 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.298717 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_701fe7be-306c-47eb-b241-4cf8a0e06584/init-config-reloader/0.log" Apr 22 18:24:55.361195 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.361119 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-98j6g_c6252ec2-1bae-4d24-83c8-f43a6bdb5885/kube-state-metrics/0.log" Apr 22 18:24:55.377917 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.377889 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-98j6g_c6252ec2-1bae-4d24-83c8-f43a6bdb5885/kube-rbac-proxy-main/0.log" Apr 22 18:24:55.393500 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.393455 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-98j6g_c6252ec2-1bae-4d24-83c8-f43a6bdb5885/kube-rbac-proxy-self/0.log" Apr 22 18:24:55.418430 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.418351 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7dbd6fccb-srbbj_db7aa446-cb16-4cf3-8ab1-3215184cf20c/metrics-server/0.log" Apr 22 18:24:55.436413 ip-10-0-131-69 
kubenswrapper[2583]: I0422 18:24:55.436384 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-7mhlw_e043748c-8906-4b65-9b1c-53c110a7b404/monitoring-plugin/0.log" Apr 22 18:24:55.459528 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.459486 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-464cb_03bd3116-7dba-4d28-b3ca-6b85602e0bf2/node-exporter/0.log" Apr 22 18:24:55.477992 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.477962 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-464cb_03bd3116-7dba-4d28-b3ca-6b85602e0bf2/kube-rbac-proxy/0.log" Apr 22 18:24:55.493516 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.493482 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-464cb_03bd3116-7dba-4d28-b3ca-6b85602e0bf2/init-textfile/0.log" Apr 22 18:24:55.759258 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.759225 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dr4qw_fb50ea50-20b2-41eb-8fc1-e4b99b72dee3/kube-rbac-proxy-main/0.log" Apr 22 18:24:55.790100 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.790055 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dr4qw_fb50ea50-20b2-41eb-8fc1-e4b99b72dee3/kube-rbac-proxy-self/0.log" Apr 22 18:24:55.813226 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.813179 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dr4qw_fb50ea50-20b2-41eb-8fc1-e4b99b72dee3/openshift-state-metrics/0.log" Apr 22 18:24:55.881020 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.880897 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/prometheus/0.log" Apr 22 18:24:55.897496 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.897468 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/config-reloader/0.log" Apr 22 18:24:55.925688 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.925284 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/thanos-sidecar/0.log" Apr 22 18:24:55.966388 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.966355 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/kube-rbac-proxy-web/0.log" Apr 22 18:24:55.984617 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:55.984585 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/kube-rbac-proxy/0.log" Apr 22 18:24:56.002663 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.002606 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/kube-rbac-proxy-thanos/0.log" Apr 22 18:24:56.024469 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.024441 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b0115df0-282f-469a-afef-106d26ba3616/init-config-reloader/0.log" Apr 22 18:24:56.148544 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.148463 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c4f7df48c-7jl9h_c44d7ace-2367-4456-b866-2a706fd03e27/telemeter-client/0.log" Apr 22 18:24:56.168555 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.168505 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-7c4f7df48c-7jl9h_c44d7ace-2367-4456-b866-2a706fd03e27/reload/0.log" Apr 22 18:24:56.189250 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.189198 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c4f7df48c-7jl9h_c44d7ace-2367-4456-b866-2a706fd03e27/kube-rbac-proxy/0.log" Apr 22 18:24:56.218231 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.218197 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76d946b44c-6cv98_50b8ed86-51c1-40bb-87e4-aeb8c84e9639/thanos-query/0.log" Apr 22 18:24:56.239206 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.239140 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76d946b44c-6cv98_50b8ed86-51c1-40bb-87e4-aeb8c84e9639/kube-rbac-proxy-web/0.log" Apr 22 18:24:56.273749 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.273719 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76d946b44c-6cv98_50b8ed86-51c1-40bb-87e4-aeb8c84e9639/kube-rbac-proxy/0.log" Apr 22 18:24:56.293041 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.292952 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76d946b44c-6cv98_50b8ed86-51c1-40bb-87e4-aeb8c84e9639/prom-label-proxy/0.log" Apr 22 18:24:56.312646 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.312602 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76d946b44c-6cv98_50b8ed86-51c1-40bb-87e4-aeb8c84e9639/kube-rbac-proxy-rules/0.log" Apr 22 18:24:56.378769 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:56.378722 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-76d946b44c-6cv98_50b8ed86-51c1-40bb-87e4-aeb8c84e9639/kube-rbac-proxy-metrics/0.log" Apr 22 18:24:58.714533 ip-10-0-131-69 
kubenswrapper[2583]: I0422 18:24:58.714468 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kx68p/must-gather-vk4sx" podStartSLOduration=11.908703941 podStartE2EDuration="12.714446606s" podCreationTimestamp="2026-04-22 18:24:46 +0000 UTC" firstStartedPulling="2026-04-22 18:24:47.471561625 +0000 UTC m=+1869.348830774" lastFinishedPulling="2026-04-22 18:24:48.27730429 +0000 UTC m=+1870.154573439" observedRunningTime="2026-04-22 18:24:49.408246528 +0000 UTC m=+1871.285515697" watchObservedRunningTime="2026-04-22 18:24:58.714446606 +0000 UTC m=+1880.591715778" Apr 22 18:24:58.715148 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.714675 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q"] Apr 22 18:24:58.720959 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.720930 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.732353 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.732325 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q"] Apr 22 18:24:58.786407 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.786364 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-proc\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.786601 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.786422 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-sys\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: 
\"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.786601 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.786454 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-podres\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.786757 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.786673 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-lib-modules\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.786757 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.786746 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xstnj\" (UniqueName: \"kubernetes.io/projected/4f6cd5e9-1013-4e0d-9e89-65271759a649-kube-api-access-xstnj\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887317 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887277 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-sys\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887324 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-podres\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-sys\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887434 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-lib-modules\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887468 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xstnj\" (UniqueName: \"kubernetes.io/projected/4f6cd5e9-1013-4e0d-9e89-65271759a649-kube-api-access-xstnj\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887503 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-podres\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 
18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887516 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-proc\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887566 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-lib-modules\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.887848 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.887595 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4f6cd5e9-1013-4e0d-9e89-65271759a649-proc\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:58.896987 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:58.896961 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xstnj\" (UniqueName: \"kubernetes.io/projected/4f6cd5e9-1013-4e0d-9e89-65271759a649-kube-api-access-xstnj\") pod \"perf-node-gather-daemonset-zsp9q\" (UID: \"4f6cd5e9-1013-4e0d-9e89-65271759a649\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:59.043465 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:59.043365 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:24:59.384797 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:59.384495 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q"] Apr 22 18:24:59.388644 ip-10-0-131-69 kubenswrapper[2583]: W0422 18:24:59.388589 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4f6cd5e9_1013_4e0d_9e89_65271759a649.slice/crio-b1130fca367f330a2fb8d8ec83013a1d74cc04fb9ed9dac7284e2e668b70a7b4 WatchSource:0}: Error finding container b1130fca367f330a2fb8d8ec83013a1d74cc04fb9ed9dac7284e2e668b70a7b4: Status 404 returned error can't find the container with id b1130fca367f330a2fb8d8ec83013a1d74cc04fb9ed9dac7284e2e668b70a7b4 Apr 22 18:24:59.434454 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:59.434423 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" event={"ID":"4f6cd5e9-1013-4e0d-9e89-65271759a649","Type":"ContainerStarted","Data":"b1130fca367f330a2fb8d8ec83013a1d74cc04fb9ed9dac7284e2e668b70a7b4"} Apr 22 18:24:59.720754 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:59.720723 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pbfbf_db6e7b89-b68b-4920-9aaa-d35998aee879/dns/0.log" Apr 22 18:24:59.740792 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:59.740762 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pbfbf_db6e7b89-b68b-4920-9aaa-d35998aee879/kube-rbac-proxy/0.log" Apr 22 18:24:59.821529 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:24:59.821498 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qrk67_9f4638bd-d27c-475b-9ecd-d0faa1ba55d2/dns-node-resolver/0.log" Apr 22 18:25:00.281867 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:00.281837 2583 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-image-registry_node-ca-7p29d_b56ca0f7-5e42-4d61-9c1d-fff86d2affdd/node-ca/0.log" Apr 22 18:25:00.439056 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:00.439025 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" event={"ID":"4f6cd5e9-1013-4e0d-9e89-65271759a649","Type":"ContainerStarted","Data":"1fffb881f952ba1be4c07a717599c800ae6fa0e0e8ca2647b9f12e6a718d0b57"} Apr 22 18:25:00.439231 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:00.439165 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:25:00.455252 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:00.455202 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" podStartSLOduration=2.455189332 podStartE2EDuration="2.455189332s" podCreationTimestamp="2026-04-22 18:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:25:00.454443492 +0000 UTC m=+1882.331712688" watchObservedRunningTime="2026-04-22 18:25:00.455189332 +0000 UTC m=+1882.332458499" Apr 22 18:25:01.620996 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:01.620970 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dqmzc_e2b78a02-ed41-4b77-adce-5c8606af896b/serve-healthcheck-canary/0.log" Apr 22 18:25:02.066638 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:02.066595 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5skr5_6609b3c4-cfd3-49ff-a1e6-faff9ad000b9/kube-rbac-proxy/0.log" Apr 22 18:25:02.083187 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:02.083154 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-5skr5_6609b3c4-cfd3-49ff-a1e6-faff9ad000b9/exporter/0.log" Apr 22 18:25:02.101126 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:02.101094 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5skr5_6609b3c4-cfd3-49ff-a1e6-faff9ad000b9/extractor/0.log" Apr 22 18:25:04.705973 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:04.705942 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5b9bbc5c4d-d2t89_1600c6f0-3eed-49a9-a9d5-12b5e03ed346/manager/0.log" Apr 22 18:25:04.727923 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:04.727892 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-jdvrm_ef528543-6fab-4648-bad4-f26f355c74c9/openshift-lws-operator/0.log" Apr 22 18:25:05.326124 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:05.326095 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-644fd69db4-l8fxs_f101634d-e39a-4227-afbb-ab979ac80dfd/manager/0.log" Apr 22 18:25:05.384439 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:05.384390 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-f7554779f-5xljv_b9ba060a-9eb7-4c6e-ae1b-fb935c3a9d6f/manager/0.log" Apr 22 18:25:05.622011 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:05.621923 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-xqmjx_bf4cae0f-528e-41b2-9e19-292633c04a9c/manager/0.log" Apr 22 18:25:05.643658 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:05.643616 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-hqn8x_d6d9ddca-82ef-4030-be0f-332fbc6fcb61/s3-init/0.log" Apr 22 18:25:06.453695 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:06.453663 2583 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-zsp9q" Apr 22 18:25:11.834841 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.834813 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/kube-multus-additional-cni-plugins/0.log" Apr 22 18:25:11.855701 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.855678 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/egress-router-binary-copy/0.log" Apr 22 18:25:11.872583 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.872562 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/cni-plugins/0.log" Apr 22 18:25:11.892879 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.892854 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/bond-cni-plugin/0.log" Apr 22 18:25:11.916784 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.916761 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/routeoverride-cni/0.log" Apr 22 18:25:11.943526 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.943494 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/whereabouts-cni-bincopy/0.log" Apr 22 18:25:11.960688 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:11.960665 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vh7jd_dd5d086e-cf48-4cf7-91c6-f3ecfb0513a4/whereabouts-cni/0.log" Apr 22 18:25:11.988981 ip-10-0-131-69 kubenswrapper[2583]: 
I0422 18:25:11.988957 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kpz56_f625cc6f-1340-411a-b28d-19e397e691de/kube-multus/0.log" Apr 22 18:25:12.129364 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:12.129286 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xk988_c266f31e-39da-4b15-a687-f1304c2e67b7/network-metrics-daemon/0.log" Apr 22 18:25:12.157976 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:12.157934 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xk988_c266f31e-39da-4b15-a687-f1304c2e67b7/kube-rbac-proxy/0.log" Apr 22 18:25:13.678479 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.678444 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-controller/0.log" Apr 22 18:25:13.692194 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.692170 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/0.log" Apr 22 18:25:13.701058 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.700987 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovn-acl-logging/1.log" Apr 22 18:25:13.715493 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.715471 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/kube-rbac-proxy-node/0.log" Apr 22 18:25:13.733203 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.733185 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 18:25:13.751151 ip-10-0-131-69 
kubenswrapper[2583]: I0422 18:25:13.751132 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/northd/0.log" Apr 22 18:25:13.768067 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.768046 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/nbdb/0.log" Apr 22 18:25:13.784205 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.784187 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/sbdb/0.log" Apr 22 18:25:13.897543 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:13.897513 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t6kbb_44a6788e-ee8d-4d8f-9255-fc53fdbd083f/ovnkube-controller/0.log" Apr 22 18:25:15.042330 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:15.042304 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-p96l2_e510245b-cd68-4ee7-8f7d-e72ddbd61118/network-check-target-container/0.log" Apr 22 18:25:15.983846 ip-10-0-131-69 kubenswrapper[2583]: I0422 18:25:15.983800 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-96thx_21690994-61ae-4434-baa1-49a8adf56490/iptables-alerter/0.log"