Apr 17 17:21:12.650606 ip-10-0-130-19 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:21:12.650615 ip-10-0-130-19 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:21:12.650622 ip-10-0-130-19 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:21:12.650833 ip-10-0-130-19 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:21:22.892470 ip-10-0-130-19 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:21:22.892486 ip-10-0-130-19 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 38a8da935742442690fb8edb8aae2b82 --
Apr 17 17:23:53.897869 ip-10-0-130-19 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:23:54.387232 ip-10-0-130-19 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:23:54.387232 ip-10-0-130-19 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:23:54.387232 ip-10-0-130-19 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:23:54.387232 ip-10-0-130-19 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:23:54.387232 ip-10-0-130-19 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
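Note: the first boot fails with systemd result 'resources' because a path referenced by the unit (an EnvironmentFile= or the ExecStartPre= binary) does not exist yet, and the restart cannot even be scheduled because crio.service is not loaded; after the reboot the unit starts cleanly. A minimal triage sketch for pulling this view back out of the journal (a hypothetical helper, not part of this log; it assumes journalctl is on PATH):

# Surface the failure lines for kubelet.service in the current boot.
import subprocess

def kubelet_failures(unit: str = "kubelet.service") -> list[str]:
    out = subprocess.run(
        ["journalctl", "-u", unit, "-b", "--no-pager"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line for line in out.splitlines() if "Failed" in line]

if __name__ == "__main__":
    for line in kubelet_failures():
        print(line)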
Apr 17 17:23:54.389975 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.389884 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:23:54.396816 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.396790 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
[... identical feature_gate.go:328 "unrecognized feature gate" warnings, 17:23:54.396812 through 17:23:54.397065, one per gate, interleaved with the two gate-setting notices shown below: OpenShiftPodSecurityAdmission, MultiArchInstallAzure, CPMSMachineNamePrefix, MixedCPUsAllocation, ManagedBootImagesAzure, ClusterVersionOperatorConfiguration, GatewayAPIController, AlibabaPlatform, ImageModeStatusReporting, AzureDedicatedHosts, NewOLMCatalogdAPIV1Metas, InsightsConfig, MachineAPIOperatorDisableMachineHealthCheckController, AzureMultiDisk, PinnedImages, HighlyAvailableArbiter, NewOLMPreflightPermissionChecks, ClusterAPIInstall, VSphereMixedNodeEnv, SetEIPForNLBIngressController, ImageStreamImportMode, VSphereHostVMGroupZonal, VSphereConfigurableMaxAllowedBlockVolumesPerNode, AzureClusterHostedDNSInstall, NetworkSegmentation, KMSEncryptionProvider, Example, SigstoreImageVerificationPKI, NetworkDiagnosticsConfig, GCPClusterHostedDNSInstall, ManagedBootImagesvSphere, AdminNetworkPolicy, AutomatedEtcdBackup, DyanmicServiceEndpointIBMCloud, GCPCustomAPIEndpointsInstall, NewOLMOwnSingleNamespace, BootcNodeManagement, VSphereMultiDisk, MachineAPIMigration, GatewayAPI, Example2, IrreconcilableMachineConfig, UpgradeStatus, InsightsOnDemandDataGather, IngressControllerDynamicConfigurationManager, VSphereMultiNetworks, ExternalSnapshotMetadata, NewOLM, AWSServiceLBNetworkSecurityGroup, VolumeGroupSnapshot, ConsolePluginContentSecurityPolicy, RouteAdvertisements, SigstoreImageVerification, SignatureStores, DualReplica, ManagedBootImagesAWS, IngressControllerLBSubnetsAWS, ClusterMonitoringConfig, MetricsCollectionProfiles, GCPClusterHostedDNS, AzureWorkloadIdentity, AWSDedicatedHosts, ExternalOIDCWithUIDAndExtraClaimMappings, InsightsConfigAPI, ManagedBootImages, MachineConfigNodes, OVNObservability, NutanixMultiSubnets, BuildCSIVolumes, ShortCertRotation, AWSClusterHostedDNSInstall, MultiDiskSetup, AWSClusterHostedDNS, ExternalOIDC, PreconfiguredUDNAddresses, AdditionalRoutingCapabilities, BootImageSkewEnforcement, NetworkLiveMigration, DNSNameResolver, ClusterAPIInstallIBMCloud, EtcdBackendQuota, NewOLMWebhookProviderOpenshiftServiceCA, NoRegistryClusterOperations ...]
Apr 17 17:23:54.397018 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.396853 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:54.398516 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397013 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
[... the same set of "unrecognized feature gate" warnings repeats in a different order, 17:23:54.397472 through 17:23:54.397743, again with the KMSv1 and ServiceAccountTokenNodeBinding notices ...]
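Note: the "unrecognized feature gate" warnings are feature_gate.go skipping names the kubelet does not register; the names here are OpenShift cluster-level gates handed down to the node, and the kubelet warns and continues rather than failing startup (it goes on to dump its flags below). A minimal sketch of that warn-on-unknown pattern (illustration only, with a stand-in registry; not kubelet's actual code):

# KNOWN_GATES is a stand-in for the component's registered gates.
import logging

KNOWN_GATES = {"ServiceAccountTokenNodeBinding": True, "KMSv1": False}

def apply_feature_gates(requested: dict[str, bool]) -> dict[str, bool]:
    enabled = dict(KNOWN_GATES)
    for name, value in requested.items():
        if name not in enabled:
            # Unknown gate: warn and skip, do not abort startup.
            logging.warning("unrecognized feature gate: %s", name)
            continue
        enabled[name] = value
    return enabled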
Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397717 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397720 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397723 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397726 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397730 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397732 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397735 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397737 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397741 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.397743 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397825 2580 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397832 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397839 2580 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397844 2580 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397850 2580 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397853 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397858 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397864 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397867 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397870 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 17:23:54.400938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397874 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397878 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397881 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397884 2580 flags.go:64] FLAG: 
--cgroup-root="" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397887 2580 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397890 2580 flags.go:64] FLAG: --client-ca-file="" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397893 2580 flags.go:64] FLAG: --cloud-config="" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397895 2580 flags.go:64] FLAG: --cloud-provider="external" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397898 2580 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397902 2580 flags.go:64] FLAG: --cluster-domain="" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397905 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397908 2580 flags.go:64] FLAG: --config-dir="" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397911 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397915 2580 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397919 2580 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397923 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397926 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397929 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397933 2580 flags.go:64] FLAG: --contention-profiling="false" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397935 2580 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397939 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397942 2580 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397945 2580 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397949 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397952 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 17:23:54.401485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397955 2580 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397958 2580 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397962 2580 flags.go:64] FLAG: --enable-server="true" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397965 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397970 2580 flags.go:64] FLAG: --event-burst="100" Apr 
17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397973 2580 flags.go:64] FLAG: --event-qps="50" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397977 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397980 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397983 2580 flags.go:64] FLAG: --eviction-hard="" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397987 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397990 2580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397992 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397995 2580 flags.go:64] FLAG: --eviction-soft="" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.397999 2580 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398001 2580 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398004 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398007 2580 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398010 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398013 2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398016 2580 flags.go:64] FLAG: --feature-gates="" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398020 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398023 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398027 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398030 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398033 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:23:54.402110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398036 2580 flags.go:64] FLAG: --help="false" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398039 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-130-19.ec2.internal" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398043 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398046 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398049 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398052 2580 flags.go:64] 
FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398056 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398059 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398061 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398064 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398069 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398072 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398075 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398078 2580 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398081 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398084 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398087 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398090 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398093 2580 flags.go:64] FLAG: --lock-file="" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398096 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398099 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398102 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398107 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:23:54.402740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398110 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398113 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398116 2580 flags.go:64] FLAG: --logging-format="text" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398119 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398122 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398125 2580 flags.go:64] FLAG: --manifest-url="" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398128 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398133 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:23:54.403324 ip-10-0-130-19 
kubenswrapper[2580]: I0417 17:23:54.398136 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398140 2580 flags.go:64] FLAG: --max-pods="110" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398143 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398146 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398149 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398152 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398155 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398158 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398161 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398168 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398171 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398175 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398178 2580 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398181 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398186 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398189 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:23:54.403324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398192 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398195 2580 flags.go:64] FLAG: --port="10250" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398198 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398201 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0394929aa2294274f" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398204 2580 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398210 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398213 2580 flags.go:64] FLAG: --register-node="true" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398216 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398219 2580 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398227 2580 flags.go:64] 
FLAG: --registry-burst="10" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398230 2580 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398233 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398236 2580 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398240 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398243 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398246 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398248 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398251 2580 flags.go:64] FLAG: --runonce="false" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398254 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398257 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398260 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398263 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398266 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398269 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398272 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398275 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:23:54.403916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398278 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398281 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398284 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398287 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398291 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398294 2580 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398297 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398302 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398305 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398308 2580 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398312 2580 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398316 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398319 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398322 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398325 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398328 2580 flags.go:64] FLAG: --v="2" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398333 2580 flags.go:64] FLAG: --version="false" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398337 2580 flags.go:64] FLAG: --vmodule="" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398341 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.398345 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398436 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398439 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398442 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398445 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:23:54.404595 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398448 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398451 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398454 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398457 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398460 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398463 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398469 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398472 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398475 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398477 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 
17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398480 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398482 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398485 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398488 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398490 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398493 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398496 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398499 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398501 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398508 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:54.405205 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398510 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398513 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398516 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398518 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398521 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398524 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398527 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398530 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398532 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398535 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398537 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398540 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398543 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398545 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398548 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398551 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398553 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398555 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398559 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398562 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:54.405749 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398564 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398567 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398570 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398572 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398575 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398592 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398597 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398601 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398604 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398607 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398610 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398613 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398616 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398619 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398621 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398624 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398627 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398629 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398632 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398634 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:54.406334 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398637 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398640 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398643 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398645 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398648 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398650 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398652 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398655 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398659 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398662 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398666 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398669 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398672 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398674 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398676 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398679 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398681 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398684 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398687 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398690 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:54.406889 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398692 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.398697 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
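Note: the warning storm above comes from the upstream kubelet flag parser rejecting gate names that only OpenShift components understand (feature_gate.go:328), and the same list is re-emitted on each parsing pass, so most of the entries below repeat it. A minimal Python sketch, assuming this excerpt has been saved to a hypothetical kubelet.log, to collapse the repetition into per-gate counts:

import re
from collections import Counter

# Tally each gate named in an "unrecognized feature gate" warning.
pattern = re.compile(r"unrecognized feature gate: (\S+)")
with open("kubelet.log") as fh:  # hypothetical path holding this excerpt
    counts = Counter(pattern.findall(fh.read()))

for gate, seen in sorted(counts.items()):
    print(f"{gate}: {seen}")

Gates whose count is lower than the rest were only rejected on some of the parsing passes.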
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.399806 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.406686 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.406706 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406757 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406762 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406766 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406769 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406772 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406775 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406778 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406781 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406784 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406786 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:54.407387 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406789 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406792 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406796 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406799 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406801 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406804 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406807 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406809 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406813 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406816 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406818 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406821 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406823 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406826 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406828 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406831 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406834 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406837 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406839 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406842 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:23:54.407778 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406844 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406849 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406852 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406855 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406858 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406860 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406863 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406866 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406868 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406871 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406875 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406878 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406881 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406884 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406886 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406889 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406892 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406894 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406897 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406900 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:54.408300 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406903 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406905 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406908 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406911 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406914 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406916 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406919 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406921 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406924 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406926 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406929 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406931 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406934 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406937 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406940 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406943 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406945 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406948 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406950 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406953 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:54.408798 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406955 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406957 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406960 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406963 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406965 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406968 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406970 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406972 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406975 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406978 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406980 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406983 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406985 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406988 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406991 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:54.409286 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.406995 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.407000 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407108 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407112 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407115 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407117 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407120 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407123 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407125 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407128 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407131 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407133 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407137 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407140 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407142 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407145 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:54.409672 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407148 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407150 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407153 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407156 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407159 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407162 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407164 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407166 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407169 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407172 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407174 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407177 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407180 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407184 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407187 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407190 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407193 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407195 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407198 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:54.410066 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407200 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407203 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407205 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407208 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407210 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407213 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407216 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407219 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407221 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407224 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407227 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407230 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407233 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407236 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407238 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407241 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407243 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407246 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407248 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407251 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:54.410669 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407253 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407256 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407259 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407262 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407264 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407266 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407269 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407271 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407274 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407276 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407279 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407281 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407284 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407286 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407288 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407291 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407293 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407296 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407299 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:54.411160 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407301 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407304 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407306 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407309 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407312 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407315 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407317 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407320 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407322 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407325 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407327 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407330 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407332 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:54.407335 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.407340 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:23:54.411790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.408146 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:23:54.412199 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.410229 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:23:54.412199 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.411279 2580 server.go:1019] "Starting client certificate rotation"
Apr 17 17:23:54.412199 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.411393 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:23:54.412199 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.411453 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:23:54.438399 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.438368 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:23:54.443972 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.443947 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:23:54.458631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.458590 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:23:54.465423 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.465405 2580 log.go:25] "Validated CRI v1 image API"
Apr 17 17:23:54.468959 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.468940 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:23:54.473964 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.473937 2580 fs.go:135] Filesystem UUIDs: map[32133fdf-5b95-4ba7-bf32-5ad7166059c8:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 eaa20d55-1136-416b-9da6-8e841428e4a5:/dev/nvme0n1p3]
Apr 17 17:23:54.474026 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.473963 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:23:54.474094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.474079 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:23:54.480308 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.480189 2580 manager.go:217] Machine: {Timestamp:2026-04-17 17:23:54.478105119 +0000 UTC m=+0.449388451 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098956 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21e010484a0436678b08b2a6b719c9 SystemUUID:ec21e010-484a-0436-678b-08b2a6b719c9 BootID:38a8da93-5742-4426-90fb-8edb8aae2b82 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fe:af:a6:26:4b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fe:af:a6:26:4b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:bb:e5:03:41:21 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:23:54.480308 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.480303 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:23:54.480447 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.480434 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:23:54.481529 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.481502 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:23:54.481702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.481532 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-19.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:23:54.481756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.481711 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:23:54.481756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.481721 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:23:54.481756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.481734 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:23:54.482352 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.482341 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:23:54.483196 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.483185 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:23:54.483299 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.483290 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:23:54.486185 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.486164 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:23:54.486236 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.486195 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:23:54.486236 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.486208 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:23:54.486236 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.486223 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:23:54.486367 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.486241 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 17:23:54.487245 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.487233 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:23:54.487300 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.487261 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:23:54.490610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.490590 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 17:23:54.492115 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.492101 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 17:23:54.492415 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.492401 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-558pq"
Apr 17 17:23:54.494245 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494214 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 17:23:54.494245 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494231 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 17:23:54.494245 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494237 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 17:23:54.494245 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494243 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494249 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494254 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494261 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494270 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494279 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494288 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494297 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 17:23:54.494368 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.494305 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 17:23:54.495362 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.495351 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 17:23:54.495362 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.495363 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 17:23:54.496622 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.496599 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-19.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 17:23:54.496677 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.496622 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 17:23:54.499215 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.499202 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 17:23:54.499258 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.499241 2580 server.go:1295] "Started kubelet"
Apr 17 17:23:54.499338 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.499312 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 17:23:54.499448 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.499399 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 17:23:54.499504 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.499477 2580 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 17:23:54.500182 ip-10-0-130-19 systemd[1]: Started Kubernetes Kubelet.
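The nodeConfig={...} payload on the container_manager_linux.go:275 line above is a single JSON object carrying, among other things, the SystemReserved values and the HardEvictionThresholds that the eviction manager later enforces. A sketch, again assuming this excerpt has been saved to a hypothetical kubelet.log with one journal entry per line, that pulls the object out and pretty-prints the thresholds:

import json

with open("kubelet.log") as fh:  # hypothetical path holding this excerpt
    line = next(l for l in fh if "Creating Container Manager object" in l)

# The JSON object runs from the brace after "nodeConfig=" to the entry's
# final closing brace.
payload = line.split("nodeConfig=", 1)[1]
config = json.loads(payload[: payload.rindex("}") + 1])
print(json.dumps(config["HardEvictionThresholds"], indent=2))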
Apr 17 17:23:54.500462 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.500435 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-558pq" Apr 17 17:23:54.500883 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.500747 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:23:54.501943 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.501929 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:23:54.506629 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.506610 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:23:54.507484 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.507465 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:23:54.508403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.508385 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:23:54.508403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.508406 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:23:54.508717 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.508704 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:23:54.508830 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.508819 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:23:54.508830 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.508831 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:23:54.509621 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.509557 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 17 17:23:54.511650 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.511623 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:23:54.512282 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512260 2580 factory.go:55] Registering systemd factory Apr 17 17:23:54.512372 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512291 2580 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:23:54.512810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512787 2580 factory.go:153] Registering CRI-O factory Apr 17 17:23:54.512810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512808 2580 factory.go:223] Registration of the crio container factory successfully Apr 17 17:23:54.512919 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512910 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:23:54.512969 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512934 2580 factory.go:103] Registering Raw factory Apr 17 17:23:54.512969 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.512955 2580 manager.go:1196] Started watching for new ooms in manager Apr 17 17:23:54.513668 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.513650 2580 manager.go:319] Starting recovery of all containers Apr 17 17:23:54.514264 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.514238 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 17:23:54.517808 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.517785 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-19.ec2.internal\" not found" node="ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.517905 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.517873 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-19.ec2.internal" not found
Apr 17 17:23:54.523205 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.523191 2580 manager.go:324] Recovery completed
Apr 17 17:23:54.527259 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.527247 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:54.529651 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.529629 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:54.529708 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.529664 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:54.529708 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.529675 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:54.530198 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.530184 2580 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 17:23:54.530198 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.530196 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 17:23:54.530268 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.530212 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:23:54.532569 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.532555 2580 policy_none.go:49] "None policy: Start"
Apr 17 17:23:54.532632 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.532571 2580 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 17:23:54.532632 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.532593 2580 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 17:23:54.534557 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.534540 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-19.ec2.internal" not found
Apr 17 17:23:54.572164 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572148 2580 manager.go:341] "Starting Device Plugin manager"
Apr 17 17:23:54.572283 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.572257 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 17:23:54.572283 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572271 2580 server.go:85] "Starting device plugin registration server"
Apr 17 17:23:54.572555 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572543 2580 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 17:23:54.572650 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572558 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 17:23:54.572718 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572701 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 17:23:54.572790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572778 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 17:23:54.572862 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.572791 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 17:23:54.573339 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.573317 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 17:23:54.573415 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.573365 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:54.591721 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.591691 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-19.ec2.internal" not found
Apr 17 17:23:54.634461 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.634414 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 17:23:54.635857 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.635840 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 17:23:54.635934 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.635867 2580 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 17:23:54.635934 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.635886 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 17:23:54.635934 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.635892 2580 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 17:23:54.635934 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.635927 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 17:23:54.639083 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.639028 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:54.673098 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.673071 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:54.674060 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.674043 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:54.674151 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.674075 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:54.674151 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.674090 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:54.674151 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.674116 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.684134 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.684115 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.684206 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.684139 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-19.ec2.internal\": node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:54.704331 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.704299 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:54.736618 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.736546 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"]
Apr 17 17:23:54.736757 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.736683 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:54.738744 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.738728 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:54.738825 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.738763 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:54.738825 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.738775 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:54.739936 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.739925 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:54.740101 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740086 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.740155 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740118 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:54.740722 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740702 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:54.740810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740725 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:54.740810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740750 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:54.740810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740766 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:54.740810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740728 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:54.741000 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.740812 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:54.742128 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.742113 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.742198 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.742140 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:54.743290 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.743276 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:54.743359 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.743298 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:54.743359 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.743312 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:54.780474 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.780450 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-19.ec2.internal\" not found" node="ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.784774 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.784759 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-19.ec2.internal\" not found" node="ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.805015 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.804988 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:54.810309 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.810292 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.810385 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.810318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.810385 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.810340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5e99fc6db543cf6951686e44ee274cc-config\") pod \"kube-apiserver-proxy-ip-10-0-130-19.ec2.internal\" (UID: \"a5e99fc6db543cf6951686e44ee274cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.905273 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:54.905188 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:54.910487 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.910462 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.910608 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.910504 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5e99fc6db543cf6951686e44ee274cc-config\") pod \"kube-apiserver-proxy-ip-10-0-130-19.ec2.internal\" (UID: \"a5e99fc6db543cf6951686e44ee274cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.910608 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.910531 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.910608 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.910601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5e99fc6db543cf6951686e44ee274cc-config\") pod \"kube-apiserver-proxy-ip-10-0-130-19.ec2.internal\" (UID: \"a5e99fc6db543cf6951686e44ee274cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.910726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.910612 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:54.910726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:54.910607 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:55.005968 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.005923 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.082417 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.082394 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:55.088127 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.088102 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:55.106902 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.106878 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.207626 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.207517 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.308046 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.308008 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.408464 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.408429 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.410608 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.410595 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:23:55.410733 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.410718 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:23:55.410794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.410748 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:23:55.502404 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.502346 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:18:54 +0000 UTC" deadline="2027-11-30 15:37:13.864155959 +0000 UTC"
Apr 17 17:23:55.502404 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.502398 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14206h13m18.361761531s"
Apr 17 17:23:55.507501 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.507476 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 17:23:55.509203 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.509181 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.526743 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.526717 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:23:55.553125 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.553094 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kp5p4"
Apr 17 17:23:55.561255 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.561227 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kp5p4"
Apr 17 17:23:55.581339 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:55.581297 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e99fc6db543cf6951686e44ee274cc.slice/crio-6e737747b26f5c5f59e8aaa607a39dabe95b0f48a50dcdf4b7a8b96941844cbf WatchSource:0}: Error finding container 6e737747b26f5c5f59e8aaa607a39dabe95b0f48a50dcdf4b7a8b96941844cbf: Status 404 returned error can't find the container with id 6e737747b26f5c5f59e8aaa607a39dabe95b0f48a50dcdf4b7a8b96941844cbf
Apr 17 17:23:55.581626 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:55.581612 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d93d1676550c945f317175b90f5b9f.slice/crio-33ec2fc2e5946bfc16d198a4aea6e9a5fe67794e6cbec8e1613f2cea5a3ed2bd WatchSource:0}: Error finding container 33ec2fc2e5946bfc16d198a4aea6e9a5fe67794e6cbec8e1613f2cea5a3ed2bd: Status 404 returned error can't find the container with id 33ec2fc2e5946bfc16d198a4aea6e9a5fe67794e6cbec8e1613f2cea5a3ed2bd
Apr 17 17:23:55.586439 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.586415 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:23:55.605169 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.605142 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:55.609355 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.609335 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.639402 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.639349 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerStarted","Data":"33ec2fc2e5946bfc16d198a4aea6e9a5fe67794e6cbec8e1613f2cea5a3ed2bd"}
Apr 17 17:23:55.640222 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.640200 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" event={"ID":"a5e99fc6db543cf6951686e44ee274cc","Type":"ContainerStarted","Data":"6e737747b26f5c5f59e8aaa607a39dabe95b0f48a50dcdf4b7a8b96941844cbf"}
Apr 17 17:23:55.709441 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.709417 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.809827 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:55.809798 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 17 17:23:55.826855 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.826828 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:55.909022 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.908969 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"
Apr 17 17:23:55.918722 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.918695 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:23:55.919806 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.919792 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal"
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 17 17:23:55.929701 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:55.929673 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:23:56.375921 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.375893 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:23:56.487417 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.487385 2580 apiserver.go:52] "Watching apiserver" Apr 17 17:23:56.494825 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.494802 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:23:56.496888 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.496861 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-knvfd","openshift-ovn-kubernetes/ovnkube-node-rjptt","kube-system/konnectivity-agent-nz4ff","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69","openshift-cluster-node-tuning-operator/tuned-8s5r4","openshift-image-registry/node-ca-8db7c","openshift-network-diagnostics/network-check-target-4qnzz","openshift-network-operator/iptables-alerter-sj6nb","kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal","openshift-dns/node-resolver-pch4m","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal","openshift-multus/multus-72h2h","openshift-multus/multus-additional-cni-plugins-5nrtn"] Apr 17 17:23:56.498417 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.498395 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:23:56.498540 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.498471 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:23:56.499791 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.499768 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.500909 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.500869 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.503039 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.502617 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.503610 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.503752 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.503871 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.503918 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.504102 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.504193 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.504485 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.504491 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.504732 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jvc75\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.505293 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pkz6h\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.505345 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.505351 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:23:56.505737 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.505411 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.506351 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.505841 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.506706 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.506667 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q47cq\"" Apr 17 17:23:56.507847 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.507823 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.507971 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.507896 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9zw8g\"" Apr 17 17:23:56.508060 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.508016 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.508134 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.508124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.509317 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.509297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:23:56.509436 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.509366 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:23:56.510228 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.510191 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:23:56.510228 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.510203 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.510228 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.510219 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pgjpq\"" Apr 17 17:23:56.510415 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.510343 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.510693 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.510674 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-sj6nb" Apr 17 17:23:56.512878 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.512859 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-cwrgc\"" Apr 17 17:23:56.512957 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.512902 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.512957 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.512863 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.513065 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.513028 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:23:56.513657 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.513632 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pch4m" Apr 17 17:23:56.513735 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.513689 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.515241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.515224 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.515931 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.515916 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.516187 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516172 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7kqdp\"" Apr 17 17:23:56.516269 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516177 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:23:56.516470 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516451 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:23:56.516556 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516528 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.516641 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516563 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:23:56.516641 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516609 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-45gx9\"" Apr 17 17:23:56.516641 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.516621 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:23:56.517293 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517274 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysctl-conf\") pod 
\"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.517382 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517327 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-host\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.517434 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517385 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-socket-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.517434 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517406 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:23:56.517519 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-kubelet\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.517519 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517475 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-etc-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.517519 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517501 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-node-log\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.517684 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-log-socket\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.517684 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxjr\" (UniqueName: \"kubernetes.io/projected/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-kube-api-access-zmxjr\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.517684 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-modprobe-d\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.517813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517703 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-run\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.517813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517731 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sdbs\" (UniqueName: \"kubernetes.io/projected/43be0a75-8199-4093-a744-921df8b3380b-kube-api-access-9sdbs\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.517813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517765 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:23:56.517813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517806 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:23:56.517813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517811 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-run-netns\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517836 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovnkube-script-lib\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f262b32-c02c-41bc-be72-1f8ea9896bfd-agent-certs\") pod \"konnectivity-agent-nz4ff\" (UID: \"3f262b32-c02c-41bc-be72-1f8ea9896bfd\") " pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.518043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.517907 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysconfig\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.518043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518004 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-var-lib-kubelet\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.518043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518031 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vktcn\"" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31b6121d-8e98-43cd-84cf-8f938f63e6bd-tmp\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518074 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-serviceca\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518118 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-var-lib-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518152 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-run-ovn-kubernetes\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-tuned\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv28f\" (UniqueName: \"kubernetes.io/projected/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-kube-api-access-pv28f\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518219 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:23:56.518250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518242 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518267 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f262b32-c02c-41bc-be72-1f8ea9896bfd-konnectivity-ca\") pod \"konnectivity-agent-nz4ff\" (UID: \"3f262b32-c02c-41bc-be72-1f8ea9896bfd\") " pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518294 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-kubernetes\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518316 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24zmk\" (UniqueName: \"kubernetes.io/projected/31b6121d-8e98-43cd-84cf-8f938f63e6bd-kube-api-access-24zmk\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518356 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-registration-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-etc-selinux\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518428 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72c7\" (UniqueName: \"kubernetes.io/projected/1227f475-d747-4720-ad95-d72a46d6d1fb-kube-api-access-h72c7\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518451 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-cni-netd\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518476 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-sys-fs\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518499 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-systemd-units\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518523 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-systemd\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.518631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518572 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-ovn\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518763 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovnkube-config\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-systemd\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518836 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-lib-modules\") 
pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518858 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-slash\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518897 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-cni-bin\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-env-overrides\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.518975 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovn-node-metrics-cert\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.519013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysctl-d\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.519062 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-sys\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.519110 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-host\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.519264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.519135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-device-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.561928 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.561894 2580 
Apr 17 17:23:56.562114 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.562093 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14119h13m8.683275483s"
Apr 17 17:23:56.610054 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.610025 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 17:23:56.620080 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620087 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620138 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-cni-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620161 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-k8s-cni-cncf-io\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620182 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-cni-bin\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620206 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-conf-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620247 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.620296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f262b32-c02c-41bc-be72-1f8ea9896bfd-konnectivity-ca\") pod \"konnectivity-agent-nz4ff\" (UID: \"3f262b32-c02c-41bc-be72-1f8ea9896bfd\") " pod="kube-system/konnectivity-agent-nz4ff"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-kubernetes\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620328 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24zmk\" (UniqueName: \"kubernetes.io/projected/31b6121d-8e98-43cd-84cf-8f938f63e6bd-kube-api-access-24zmk\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620346 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-registration-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620364 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-system-cni-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620385 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpmz\" (UniqueName: \"kubernetes.io/projected/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-kube-api-access-vdpmz\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620409 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23dab589-f077-4e94-93bc-392122228de4-tmp-dir\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620428 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-kubernetes\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620440 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-registration-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620473 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620509 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-sys-fs\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620605 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-ovn\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620611 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-sys-fs\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620652 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23dab589-f077-4e94-93bc-392122228de4-hosts-file\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.620656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620662 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-ovn\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620692 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620736 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-systemd\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620769 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-lib-modules\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620786 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-systemd\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620793 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-cni-bin\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620819 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-env-overrides\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620845 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovn-node-metrics-cert\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620874 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-kubelet\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-cni-bin\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") "
pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-lib-modules\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cnibin\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.620927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3f262b32-c02c-41bc-be72-1f8ea9896bfd-konnectivity-ca\") pod \"konnectivity-agent-nz4ff\" (UID: \"3f262b32-c02c-41bc-be72-1f8ea9896bfd\") " pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-device-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621043 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-hostroot\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-device-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621067 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4pz\" (UniqueName: \"kubernetes.io/projected/23dab589-f077-4e94-93bc-392122228de4-kube-api-access-hl4pz\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m" Apr 17 17:23:56.621322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621090 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxgb\" (UniqueName: \"kubernetes.io/projected/afb0cf40-4c7d-4082-a5f2-64ef60067cde-kube-api-access-9cxgb\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621118 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-kubelet\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621158 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-etc-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621214 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-etc-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-log-socket\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-kubelet\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-multus-certs\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-log-socket\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621321 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-env-overrides\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621411 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:23:56.621457 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sdbs\" (UniqueName: \"kubernetes.io/projected/43be0a75-8199-4093-a744-921df8b3380b-kube-api-access-9sdbs\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621487 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621493 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-run-netns\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621541 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-system-cni-dir\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621571 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-run-netns\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621567 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f262b32-c02c-41bc-be72-1f8ea9896bfd-agent-certs\") pod \"konnectivity-agent-nz4ff\" (UID: \"3f262b32-c02c-41bc-be72-1f8ea9896bfd\") " pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.621645 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:23:56.622120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621687 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysconfig\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysconfig\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.621717 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:23:57.121685522 +0000 UTC m=+3.092968847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-var-lib-kubelet\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-serviceca\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621834 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-var-lib-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-tuned\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-var-lib-kubelet\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621894 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-var-lib-openvswitch\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv28f\" (UniqueName: 
\"kubernetes.io/projected/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-kube-api-access-pv28f\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621958 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45be972b-ce44-43f8-9b8b-860260b4c7ab-host-slash\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.621982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-daemon-config\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622005 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-etc-kubernetes\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622032 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-etc-selinux\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h72c7\" (UniqueName: \"kubernetes.io/projected/1227f475-d747-4720-ad95-d72a46d6d1fb-kube-api-access-h72c7\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:23:56.622968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-cni-netd\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622152 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovnkube-config\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 
ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/45be972b-ce44-43f8-9b8b-860260b4c7ab-iptables-alerter-script\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4vxj\" (UniqueName: \"kubernetes.io/projected/45be972b-ce44-43f8-9b8b-860260b4c7ab-kube-api-access-l4vxj\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622227 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cni-binary-copy\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622259 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-serviceca\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622290 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-systemd-units\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622307 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-cni-netd\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622353 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-etc-selinux\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-systemd\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622414 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-slash\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-run-systemd\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622439 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-cnibin\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-socket-dir-parent\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622616 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysctl-d\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622651 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-systemd-units\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.623710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-sys\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622684 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-host\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-os-release\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-netns\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622721 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-slash\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622809 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovnkube-config\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622820 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-cni-multus\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622832 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysctl-d\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysctl-conf\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-host\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622883 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-sys\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-host\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622986 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-socket-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622995 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-host\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.622991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-sysctl-conf\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-node-log\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623072 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxjr\" (UniqueName: \"kubernetes.io/projected/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-kube-api-access-zmxjr\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623114 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-cni-binary-copy\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.624408 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-node-log\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-os-release\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623223 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-modprobe-d\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-run\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623287 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-run\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623295 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovnkube-script-lib\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623327 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31b6121d-8e98-43cd-84cf-8f938f63e6bd-tmp\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623377 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-modprobe-d\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-run-ovn-kubernetes\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-host-run-ovn-kubernetes\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/43be0a75-8199-4093-a744-921df8b3380b-socket-dir\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.623875 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovnkube-script-lib\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.624811 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-ovn-node-metrics-cert\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.625144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.624930 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3f262b32-c02c-41bc-be72-1f8ea9896bfd-agent-certs\") pod \"konnectivity-agent-nz4ff\" (UID: \"3f262b32-c02c-41bc-be72-1f8ea9896bfd\") " pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.625492 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.625435 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/31b6121d-8e98-43cd-84cf-8f938f63e6bd-etc-tuned\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.625492 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.625450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31b6121d-8e98-43cd-84cf-8f938f63e6bd-tmp\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.630157 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.630104 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:23:56.630157 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.630124 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:23:56.630157 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.630137 2580 projected.go:194] Error preparing data for projected volume kube-api-access-zvdcb for pod openshift-network-diagnostics/network-check-target-4qnzz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:23:56.630348 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:56.630200 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb podName:850cf630-0fb1-482f-9e3d-a1525bdf6a39 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:57.130181615 +0000 UTC m=+3.101464934 (durationBeforeRetry 500ms). 
Apr 17 17:23:56.631078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.631036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24zmk\" (UniqueName: \"kubernetes.io/projected/31b6121d-8e98-43cd-84cf-8f938f63e6bd-kube-api-access-24zmk\") pod \"tuned-8s5r4\" (UID: \"31b6121d-8e98-43cd-84cf-8f938f63e6bd\") " pod="openshift-cluster-node-tuning-operator/tuned-8s5r4"
Apr 17 17:23:56.632877 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.632844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxjr\" (UniqueName: \"kubernetes.io/projected/6afc6d79-46b9-4af3-84d9-3ed59a13c61a-kube-api-access-zmxjr\") pod \"ovnkube-node-rjptt\" (UID: \"6afc6d79-46b9-4af3-84d9-3ed59a13c61a\") " pod="openshift-ovn-kubernetes/ovnkube-node-rjptt"
Apr 17 17:23:56.633602 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.633562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv28f\" (UniqueName: \"kubernetes.io/projected/9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c-kube-api-access-pv28f\") pod \"node-ca-8db7c\" (UID: \"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c\") " pod="openshift-image-registry/node-ca-8db7c"
Apr 17 17:23:56.633712 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.633684 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sdbs\" (UniqueName: \"kubernetes.io/projected/43be0a75-8199-4093-a744-921df8b3380b-kube-api-access-9sdbs\") pod \"aws-ebs-csi-driver-node-khc69\" (UID: \"43be0a75-8199-4093-a744-921df8b3380b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69"
Apr 17 17:23:56.633762 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.633735 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72c7\" (UniqueName: \"kubernetes.io/projected/1227f475-d747-4720-ad95-d72a46d6d1fb-kube-api-access-h72c7\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:23:56.684219 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.684181 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:56.723836 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723789 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45be972b-ce44-43f8-9b8b-860260b4c7ab-host-slash\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb"
Apr 17 17:23:56.723836 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723834 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-daemon-config\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-etc-kubernetes\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723884 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/45be972b-ce44-43f8-9b8b-860260b4c7ab-iptables-alerter-script\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4vxj\" (UniqueName: \"kubernetes.io/projected/45be972b-ce44-43f8-9b8b-860260b4c7ab-kube-api-access-l4vxj\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723881 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45be972b-ce44-43f8-9b8b-860260b4c7ab-host-slash\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.723937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cni-binary-copy\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724014 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-cnibin\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-socket-dir-parent\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724043 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-etc-kubernetes\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-os-release\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724108 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-socket-dir-parent\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-netns\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724152 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-cni-multus\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724162 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-cnibin\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-cni-binary-copy\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-cni-multus\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-os-release\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724263 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-cni-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724267 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-os-release\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724289 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-k8s-cni-cncf-io\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724315 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-cni-bin\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724316 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-netns\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724323 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-os-release\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724341 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-conf-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-system-cni-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724380 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-k8s-cni-cncf-io\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.724403 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpmz\" (UniqueName: \"kubernetes.io/projected/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-kube-api-access-vdpmz\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23dab589-f077-4e94-93bc-392122228de4-tmp-dir\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724442 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-cni-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23dab589-f077-4e94-93bc-392122228de4-hosts-file\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724492 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-system-cni-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724508 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724529 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-cni-bin\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724542 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-kubelet\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724547 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-conf-dir\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724568 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cnibin\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/45be972b-ce44-43f8-9b8b-860260b4c7ab-iptables-alerter-script\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724738 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cni-binary-copy\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23dab589-f077-4e94-93bc-392122228de4-tmp-dir\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23dab589-f077-4e94-93bc-392122228de4-hosts-file\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724795 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-var-lib-kubelet\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724809 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cnibin\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:56.725281 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724902 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-hostroot\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724932 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4pz\" (UniqueName: \"kubernetes.io/projected/23dab589-f077-4e94-93bc-392122228de4-kube-api-access-hl4pz\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724960 2580
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxgb\" (UniqueName: \"kubernetes.io/projected/afb0cf40-4c7d-4082-a5f2-64ef60067cde-kube-api-access-9cxgb\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.724991 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-multus-certs\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725001 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-hostroot\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725017 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-multus-daemon-config\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725081 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-host-run-multus-certs\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725138 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-cni-binary-copy\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-system-cni-dir\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " 
pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afb0cf40-4c7d-4082-a5f2-64ef60067cde-system-cni-dir\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.726124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.725481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/afb0cf40-4c7d-4082-a5f2-64ef60067cde-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.732974 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.732949 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxgb\" (UniqueName: \"kubernetes.io/projected/afb0cf40-4c7d-4082-a5f2-64ef60067cde-kube-api-access-9cxgb\") pod \"multus-additional-cni-plugins-5nrtn\" (UID: \"afb0cf40-4c7d-4082-a5f2-64ef60067cde\") " pod="openshift-multus/multus-additional-cni-plugins-5nrtn" Apr 17 17:23:56.733110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.732977 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpmz\" (UniqueName: \"kubernetes.io/projected/8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2-kube-api-access-vdpmz\") pod \"multus-72h2h\" (UID: \"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2\") " pod="openshift-multus/multus-72h2h" Apr 17 17:23:56.733110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.732980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4vxj\" (UniqueName: \"kubernetes.io/projected/45be972b-ce44-43f8-9b8b-860260b4c7ab-kube-api-access-l4vxj\") pod \"iptables-alerter-sj6nb\" (UID: \"45be972b-ce44-43f8-9b8b-860260b4c7ab\") " pod="openshift-network-operator/iptables-alerter-sj6nb" Apr 17 17:23:56.733110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.733005 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4pz\" (UniqueName: \"kubernetes.io/projected/23dab589-f077-4e94-93bc-392122228de4-kube-api-access-hl4pz\") pod \"node-resolver-pch4m\" (UID: \"23dab589-f077-4e94-93bc-392122228de4\") " pod="openshift-dns/node-resolver-pch4m" Apr 17 17:23:56.816119 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.816085 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:23:56.822888 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.822861 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:23:56.830286 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.830261 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" Apr 17 17:23:56.835824 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.835803 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" Apr 17 17:23:56.842404 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.842379 2580 util.go:30] "No sandbox for pod can be found. 
Apr 17 17:23:56.849031 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.849010 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sj6nb"
Apr 17 17:23:56.856560 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.856535 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pch4m"
Apr 17 17:23:56.864089 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.864065 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-72h2h"
Apr 17 17:23:56.869712 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:56.869687 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5nrtn"
Apr 17 17:23:57.128382 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.128353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:23:57.128594 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.128477 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:57.128594 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.128545 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:23:58.128526849 +0000 UTC m=+4.099810172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:57.171190 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.171142 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f262b32_c02c_41bc_be72_1f8ea9896bfd.slice/crio-3ad32f9e905ec0d63d108c63c737cd1319241675b1f2acec2e02e83dc6f56f5e WatchSource:0}: Error finding container 3ad32f9e905ec0d63d108c63c737cd1319241675b1f2acec2e02e83dc6f56f5e: Status 404 returned error can't find the container with id 3ad32f9e905ec0d63d108c63c737cd1319241675b1f2acec2e02e83dc6f56f5e
Apr 17 17:23:57.172354 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.172317 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43be0a75_8199_4093_a744_921df8b3380b.slice/crio-d9c313f65ccfccd63e07e0b795f2105258831cfa1e12fe981959a4f8b325eed9 WatchSource:0}: Error finding container d9c313f65ccfccd63e07e0b795f2105258831cfa1e12fe981959a4f8b325eed9: Status 404 returned error can't find the container with id d9c313f65ccfccd63e07e0b795f2105258831cfa1e12fe981959a4f8b325eed9
Apr 17 17:23:57.176475 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.176447 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45be972b_ce44_43f8_9b8b_860260b4c7ab.slice/crio-abcb384525a43cac81c763648ac02c5eda046aa3ca15a329be1b4f307c42b827 WatchSource:0}: Error finding container abcb384525a43cac81c763648ac02c5eda046aa3ca15a329be1b4f307c42b827: Status 404 returned error can't find the container with id abcb384525a43cac81c763648ac02c5eda046aa3ca15a329be1b4f307c42b827
Apr 17 17:23:57.177061 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.177031 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b6121d_8e98_43cd_84cf_8f938f63e6bd.slice/crio-a25fc4ce7a753c4f76fd3b9ef31d502d437e207fd1d43ac7dfbbe1bba99aadb8 WatchSource:0}: Error finding container a25fc4ce7a753c4f76fd3b9ef31d502d437e207fd1d43ac7dfbbe1bba99aadb8: Status 404 returned error can't find the container with id a25fc4ce7a753c4f76fd3b9ef31d502d437e207fd1d43ac7dfbbe1bba99aadb8
Apr 17 17:23:57.178058 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.177987 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb0cf40_4c7d_4082_a5f2_64ef60067cde.slice/crio-b889610c6aebd405720ac3fc1f379b6324ab62972cb701331271acc4eecfb2a2 WatchSource:0}: Error finding container b889610c6aebd405720ac3fc1f379b6324ab62972cb701331271acc4eecfb2a2: Status 404 returned error can't find the container with id b889610c6aebd405720ac3fc1f379b6324ab62972cb701331271acc4eecfb2a2
Apr 17 17:23:57.179262 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.179242 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d2d7141_d6ee_4b25_bef5_a3cc8d5c413c.slice/crio-f2d403f2c40f1b7aa8df09b24745da2d4b539dbe867526b056ce0ce3df3690b9 WatchSource:0}: Error finding container f2d403f2c40f1b7aa8df09b24745da2d4b539dbe867526b056ce0ce3df3690b9: Status 404 returned error can't find the container with id f2d403f2c40f1b7aa8df09b24745da2d4b539dbe867526b056ce0ce3df3690b9
Apr 17 17:23:57.179637 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.179619 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f4942e8_dd1f_4333_b5fc_5aaeb1efedb2.slice/crio-e9667824c94dc65f5c5794897e74d7778af654cf2cf9019d4d16bf5ebf177e12 WatchSource:0}: Error finding container e9667824c94dc65f5c5794897e74d7778af654cf2cf9019d4d16bf5ebf177e12: Status 404 returned error can't find the container with id e9667824c94dc65f5c5794897e74d7778af654cf2cf9019d4d16bf5ebf177e12
Apr 17 17:23:57.184039 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.181603 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afc6d79_46b9_4af3_84d9_3ed59a13c61a.slice/crio-adabafb592bbe34c8ae5855b6da349e044a61d1168e737da1356a533be789859 WatchSource:0}: Error finding container adabafb592bbe34c8ae5855b6da349e044a61d1168e737da1356a533be789859: Status 404 returned error can't find the container with id adabafb592bbe34c8ae5855b6da349e044a61d1168e737da1356a533be789859
Apr 17 17:23:57.184039 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:23:57.182774 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23dab589_f077_4e94_93bc_392122228de4.slice/crio-66f7092195e018c7d9242626c3f61ce788c7a900a4ea266d9c0cf66826ae34c9 WatchSource:0}: Error finding container 66f7092195e018c7d9242626c3f61ce788c7a900a4ea266d9c0cf66826ae34c9: Status 404 returned error can't find the container with id 66f7092195e018c7d9242626c3f61ce788c7a900a4ea266d9c0cf66826ae34c9
Apr 17 17:23:57.229602 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.229543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:23:57.229747 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.229733 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:23:57.229803 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.229755 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:23:57.229803 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.229768 2580 projected.go:194] Error preparing data for projected volume kube-api-access-zvdcb for pod openshift-network-diagnostics/network-check-target-4qnzz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:57.229870 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.229829 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb podName:850cf630-0fb1-482f-9e3d-a1525bdf6a39 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:58.229809536 +0000 UTC m=+4.201092871 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zvdcb" (UniqueName: "kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb") pod "network-check-target-4qnzz" (UID: "850cf630-0fb1-482f-9e3d-a1525bdf6a39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:57.562622 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.562454 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:18:55 +0000 UTC" deadline="2027-11-09 06:34:10.851033387 +0000 UTC"
Apr 17 17:23:57.562622 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.562497 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13693h10m13.288540416s"
Apr 17 17:23:57.637352 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.637118 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:23:57.637352 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:57.637243 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39"
Apr 17 17:23:57.653214 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.653175 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pch4m" event={"ID":"23dab589-f077-4e94-93bc-392122228de4","Type":"ContainerStarted","Data":"66f7092195e018c7d9242626c3f61ce788c7a900a4ea266d9c0cf66826ae34c9"}
Apr 17 17:23:57.664961 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.664859 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8db7c" event={"ID":"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c","Type":"ContainerStarted","Data":"f2d403f2c40f1b7aa8df09b24745da2d4b539dbe867526b056ce0ce3df3690b9"}
Apr 17 17:23:57.672366 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.672329 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" event={"ID":"31b6121d-8e98-43cd-84cf-8f938f63e6bd","Type":"ContainerStarted","Data":"a25fc4ce7a753c4f76fd3b9ef31d502d437e207fd1d43ac7dfbbe1bba99aadb8"}
Apr 17 17:23:57.676532 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.676244 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sj6nb" event={"ID":"45be972b-ce44-43f8-9b8b-860260b4c7ab","Type":"ContainerStarted","Data":"abcb384525a43cac81c763648ac02c5eda046aa3ca15a329be1b4f307c42b827"}
Apr 17 17:23:57.679983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.679952 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" event={"ID":"43be0a75-8199-4093-a744-921df8b3380b","Type":"ContainerStarted","Data":"d9c313f65ccfccd63e07e0b795f2105258831cfa1e12fe981959a4f8b325eed9"}
Apr 17 17:23:57.685323 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.685291 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nz4ff" event={"ID":"3f262b32-c02c-41bc-be72-1f8ea9896bfd","Type":"ContainerStarted","Data":"3ad32f9e905ec0d63d108c63c737cd1319241675b1f2acec2e02e83dc6f56f5e"}
Apr 17 17:23:57.698022 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.697419 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" event={"ID":"a5e99fc6db543cf6951686e44ee274cc","Type":"ContainerStarted","Data":"c93c2b7ed768ecc4e958a1a7dc1886effdeb0bc305a904e3bbf7193089b8977d"}
Apr 17 17:23:57.701185 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.701151 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"adabafb592bbe34c8ae5855b6da349e044a61d1168e737da1356a533be789859"}
Apr 17 17:23:57.711458 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.711418 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72h2h" event={"ID":"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2","Type":"ContainerStarted","Data":"e9667824c94dc65f5c5794897e74d7778af654cf2cf9019d4d16bf5ebf177e12"}
Apr 17 17:23:57.721794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:57.721758 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerStarted","Data":"b889610c6aebd405720ac3fc1f379b6324ab62972cb701331271acc4eecfb2a2"}
Apr 17 17:23:58.136705 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:58.136668 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:23:58.136855 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.136811 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:58.136923 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.136878 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:00.136858738 +0000 UTC m=+6.108142062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:58.237900 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:58.237864 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:23:58.238115 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.238058 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:23:58.238115 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.238081 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:23:58.238115 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.238095 2580 projected.go:194] Error preparing data for projected volume kube-api-access-zvdcb for pod openshift-network-diagnostics/network-check-target-4qnzz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:58.238291 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.238151 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb podName:850cf630-0fb1-482f-9e3d-a1525bdf6a39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:00.238131705 +0000 UTC m=+6.209415037 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zvdcb" (UniqueName: "kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb") pod "network-check-target-4qnzz" (UID: "850cf630-0fb1-482f-9e3d-a1525bdf6a39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:58.640893 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:58.640786 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:23:58.641410 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:58.640932 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb"
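The mount failures above show the kubelet's retry policy for pending volume operations: durationBeforeRetry starts small and doubles on each consecutive failure (1s and 2s so far; the entries that follow reach 4s, 8s, and 16s) until the missing API object is finally registered with the kubelet. A minimal, dependency-free Go sketch of that doubling pattern follows; it illustrates the behaviour visible in this log and is not the kubelet's actual implementation (the function names and the cap value are invented for the sketch):

package main

import (
	"errors"
	"fmt"
	"time"
)

// mountOnce stands in for one MountVolume.SetUp attempt; it always fails,
// the way the attempts above do while the referenced Secret or ConfigMap
// is not yet registered with the kubelet.
func mountOnce() error {
	return errors.New(`object "openshift-multus"/"metrics-daemon-secret" not registered`)
}

func main() {
	delay := 1 * time.Second         // first retry interval seen above
	const maxDelay = 2 * time.Minute // invented cap, just for the sketch

	for attempt := 1; attempt <= 5; attempt++ {
		if err := mountOnce(); err != nil {
			fmt.Printf("attempt %d failed: %v; no retries permitted for %v\n", attempt, err, delay)
			time.Sleep(delay)
			delay *= 2 // 1s -> 2s -> 4s -> 8s -> 16s, as in the surrounding entries
			if delay > maxDelay {
				delay = maxDelay
			}
			continue
		}
		fmt.Println("mount succeeded")
		return
	}
}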
pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:23:58.729489 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:58.729451 2580 generic.go:358] "Generic (PLEG): container finished" podID="62d93d1676550c945f317175b90f5b9f" containerID="96ef7ff3188e84a4a7fe03a0bd5af8939411c7b720eaf94634d17d08d3969bac" exitCode=0 Apr 17 17:23:58.730638 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:58.730398 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerDied","Data":"96ef7ff3188e84a4a7fe03a0bd5af8939411c7b720eaf94634d17d08d3969bac"} Apr 17 17:23:58.749430 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:58.749364 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" podStartSLOduration=3.749344771 podStartE2EDuration="3.749344771s" podCreationTimestamp="2026-04-17 17:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:23:57.714800376 +0000 UTC m=+3.686083718" watchObservedRunningTime="2026-04-17 17:23:58.749344771 +0000 UTC m=+4.720628114" Apr 17 17:23:59.637060 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:59.637021 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:23:59.637236 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:23:59.637146 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:23:59.736383 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:59.736342 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerStarted","Data":"5f2437d30bd27e65c2b5fe3b1e98b258df9b9e084dd1a1c1131ea3b8c21ac825"} Apr 17 17:23:59.750688 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:23:59.749832 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" podStartSLOduration=4.749814868 podStartE2EDuration="4.749814868s" podCreationTimestamp="2026-04-17 17:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:23:59.749207371 +0000 UTC m=+5.720490732" watchObservedRunningTime="2026-04-17 17:23:59.749814868 +0000 UTC m=+5.721098209" Apr 17 17:24:00.155677 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.155636 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:00.155901 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.155875 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:00.155970 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.155962 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:04.155938653 +0000 UTC m=+10.127221987 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:00.257247 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.257061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:00.257429 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.257279 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:00.257429 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.257307 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:00.257429 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.257320 2580 projected.go:194] Error preparing data for projected volume kube-api-access-zvdcb for pod openshift-network-diagnostics/network-check-target-4qnzz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:00.257429 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.257382 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb podName:850cf630-0fb1-482f-9e3d-a1525bdf6a39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:04.257362872 +0000 UTC m=+10.228646199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zvdcb" (UniqueName: "kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb") pod "network-check-target-4qnzz" (UID: "850cf630-0fb1-482f-9e3d-a1525bdf6a39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:00.435268 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.435190 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8cqlg"] Apr 17 17:24:00.438155 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.438106 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.438292 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.438181 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:00.559880 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.559841 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9458a330-4a73-457d-a605-d7998538c01b-dbus\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.560042 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.559958 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9458a330-4a73-457d-a605-d7998538c01b-kubelet-config\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.560042 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.559989 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.636955 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.636913 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:00.637108 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.637045 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.660820 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9458a330-4a73-457d-a605-d7998538c01b-kubelet-config\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.660886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.660931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9458a330-4a73-457d-a605-d7998538c01b-dbus\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.661144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9458a330-4a73-457d-a605-d7998538c01b-dbus\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:00.661208 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9458a330-4a73-457d-a605-d7998538c01b-kubelet-config\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.661298 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:00.661694 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:00.661390 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret podName:9458a330-4a73-457d-a605-d7998538c01b nodeName:}" failed. No retries permitted until 2026-04-17 17:24:01.161337717 +0000 UTC m=+7.132621042 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret") pod "global-pull-secret-syncer-8cqlg" (UID: "9458a330-4a73-457d-a605-d7998538c01b") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:01.166240 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:01.166197 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:01.166672 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:01.166369 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:01.166672 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:01.166437 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret podName:9458a330-4a73-457d-a605-d7998538c01b nodeName:}" failed. No retries permitted until 2026-04-17 17:24:02.166418728 +0000 UTC m=+8.137702061 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret") pod "global-pull-secret-syncer-8cqlg" (UID: "9458a330-4a73-457d-a605-d7998538c01b") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:01.636772 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:01.636731 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:01.636965 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:01.636874 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:02.174847 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:02.174745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:02.175298 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:02.174884 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:02.175298 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:02.174953 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret podName:9458a330-4a73-457d-a605-d7998538c01b nodeName:}" failed. No retries permitted until 2026-04-17 17:24:04.174934353 +0000 UTC m=+10.146217687 (durationBeforeRetry 2s). 
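Every pod failing to sync in this stretch (network-check-target-4qnzz, network-metrics-daemon-knvfd, global-pull-secret-syncer-8cqlg) is blocked on one condition: the runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet. On an OVN-Kubernetes node that file is normally written by ovnkube-node, whose sandbox only started a few seconds earlier, so these errors are expected to clear on their own once the plugin comes up. As a hedged diagnostic sketch (not OpenShift tooling; only the path is taken from the log messages), a small Go poller that reports when a CNI configuration appears:

package main

import (
	"fmt"
	"path/filepath"
	"time"
)

func main() {
	// The directory named in the kubelet errors above.
	const cniDir = "/etc/kubernetes/cni/net.d"
	for {
		// CNI configs conventionally end in .conf or .conflist.
		matches, err := filepath.Glob(filepath.Join(cniDir, "*.conf*"))
		if err == nil && len(matches) > 0 {
			fmt.Println("CNI configuration present:", matches)
			return // NetworkReady should flip to true shortly afterwards
		}
		fmt.Println("no CNI configuration yet; network plugin not ready")
		time.Sleep(2 * time.Second)
	}
}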
Apr 17 17:24:02.636699 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:02.636661 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:02.636884 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:02.636801 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b"
Apr 17 17:24:02.636884 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:02.636661 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:02.637007 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:02.636978 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb"
Apr 17 17:24:03.637163 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:03.636884 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:24:03.637163 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:03.636985 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39"
Apr 17 17:24:04.193776 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:04.193140 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:04.193776 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:04.193198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:04.193776 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.193369 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:24:04.193776 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.193431 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret podName:9458a330-4a73-457d-a605-d7998538c01b nodeName:}" failed. No retries permitted until 2026-04-17 17:24:08.193411924 +0000 UTC m=+14.164695268 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret") pod "global-pull-secret-syncer-8cqlg" (UID: "9458a330-4a73-457d-a605-d7998538c01b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:24:04.193776 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.193625 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:04.193776 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.193684 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:12.193669302 +0000 UTC m=+18.164952638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:04.293878 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:04.293842 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:24:04.294077 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.294050 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:04.294150 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.294078 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:04.294150 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.294092 2580 projected.go:194] Error preparing data for projected volume kube-api-access-zvdcb for pod openshift-network-diagnostics/network-check-target-4qnzz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:04.294150 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.294142 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb podName:850cf630-0fb1-482f-9e3d-a1525bdf6a39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:12.29412748 +0000 UTC m=+18.265410812 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zvdcb" (UniqueName: "kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb") pod "network-check-target-4qnzz" (UID: "850cf630-0fb1-482f-9e3d-a1525bdf6a39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:04.639342 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:04.639280 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:04.639794 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.639473 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb"
Apr 17 17:24:04.639794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:04.639527 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:04.639794 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:04.639703 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b"
Apr 17 17:24:05.636404 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:05.636341 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:24:05.636572 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:05.636470 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39"
Apr 17 17:24:06.636460 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:06.636429 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:06.636904 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:06.636554 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb"
Apr 17 17:24:06.636904 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:06.636602 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:06.636904 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:06.636658 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b"
Apr 17 17:24:07.636771 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:07.636735 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:24:07.637211 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:07.636877 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39"
Apr 17 17:24:08.226476 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:08.226411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:08.226657 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:08.226553 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:24:08.226657 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:08.226654 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret podName:9458a330-4a73-457d-a605-d7998538c01b nodeName:}" failed. No retries permitted until 2026-04-17 17:24:16.226631334 +0000 UTC m=+22.197914675 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret") pod "global-pull-secret-syncer-8cqlg" (UID: "9458a330-4a73-457d-a605-d7998538c01b") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:24:08.636463 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:08.636425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:08.636661 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:08.636433 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:08.636661 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:08.636570 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b"
Apr 17 17:24:08.636783 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:08.636670 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb"
Apr 17 17:24:09.636458 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:09.636424 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:24:09.636927 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:09.636547 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39"
Apr 17 17:24:10.636280 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:10.636244 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg"
Apr 17 17:24:10.636487 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:10.636250 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:10.636487 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:10.636382 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b"
Apr 17 17:24:10.636487 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:10.636473 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb"
Apr 17 17:24:11.636307 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:11.636270 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz"
Apr 17 17:24:11.636524 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:11.636386 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39"
Apr 17 17:24:12.255126 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:12.255083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:24:12.255294 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.255235 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:12.255344 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.255313 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:28.25529385 +0000 UTC m=+34.226577193 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:12.355738 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:12.355694 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:12.355921 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.355877 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:12.355921 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.355903 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:12.355921 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.355918 2580 projected.go:194] Error preparing data for projected volume kube-api-access-zvdcb for pod openshift-network-diagnostics/network-check-target-4qnzz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:12.356077 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.355985 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb podName:850cf630-0fb1-482f-9e3d-a1525bdf6a39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:28.35596587 +0000 UTC m=+34.327249206 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zvdcb" (UniqueName: "kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb") pod "network-check-target-4qnzz" (UID: "850cf630-0fb1-482f-9e3d-a1525bdf6a39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:12.636953 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:12.636918 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:12.637399 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:12.636918 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:12.637399 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.637048 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:12.637399 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:12.637131 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:13.637057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:13.637014 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:13.637449 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:13.637150 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:14.637317 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.637138 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:14.637963 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.637196 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:14.637963 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:14.637389 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:14.637963 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:14.637503 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:14.761904 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.761873 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nz4ff" event={"ID":"3f262b32-c02c-41bc-be72-1f8ea9896bfd","Type":"ContainerStarted","Data":"e713cc63cdbcc1303e0e73cbef1cc48bf5e3996c09950fa25d7a26e85b4579b5"} Apr 17 17:24:14.763330 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.763312 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:24:14.763565 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.763546 2580 generic.go:358] "Generic (PLEG): container finished" podID="6afc6d79-46b9-4af3-84d9-3ed59a13c61a" containerID="e88701741f76565a285a939f2cc2e11a7c5fe9f292af2c086b82725dff83b2a5" exitCode=1 Apr 17 17:24:14.763662 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.763604 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"8ecba6167e6285d9786ae2b443c40d10f507420bc3cc517eb404a797d7ee06d4"} Apr 17 17:24:14.763662 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.763628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerDied","Data":"e88701741f76565a285a939f2cc2e11a7c5fe9f292af2c086b82725dff83b2a5"} Apr 17 17:24:14.763662 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.763639 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"fabfa3c7b2a40cc3507c7a437914440a1890b67f5cbe1cd788191d7197ad201b"} Apr 17 17:24:14.764622 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.764599 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-72h2h" event={"ID":"8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2","Type":"ContainerStarted","Data":"f3f188efd068c15d1730d94cc3b5239d17c8216d6fa5c22eef0ffd33a990896f"} Apr 17 17:24:14.765745 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.765724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerStarted","Data":"394c93031fda04da78d8e62d426c5c2897d0e6b2f62941a655e6e9cb5494a8c2"} Apr 17 17:24:14.767016 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.766996 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pch4m" event={"ID":"23dab589-f077-4e94-93bc-392122228de4","Type":"ContainerStarted","Data":"1596ed23d5c6227a3cbc67f04bd79e9ed61e200876fdfb6c1e4af4309e8db792"} Apr 17 17:24:14.768308 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.768279 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8db7c" event={"ID":"9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c","Type":"ContainerStarted","Data":"979e8490ae816793c42c2d046552a888f5ea30f187c164768d7b1a6aa64a89a1"} Apr 17 17:24:14.769850 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.769825 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" 
event={"ID":"31b6121d-8e98-43cd-84cf-8f938f63e6bd","Type":"ContainerStarted","Data":"7be68ae140d259dd20bdd5b4c71921ef01ee17b127688cedff6e376f1845b494"} Apr 17 17:24:14.771150 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.771131 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" event={"ID":"43be0a75-8199-4093-a744-921df8b3380b","Type":"ContainerStarted","Data":"bff6082bb752063613ca3aebd1f221848c33ce784c01fe673e8f3ebb18da0262"} Apr 17 17:24:14.777369 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.777322 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nz4ff" podStartSLOduration=4.119487491 podStartE2EDuration="20.777306356s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.173795212 +0000 UTC m=+3.145078531" lastFinishedPulling="2026-04-17 17:24:13.831614068 +0000 UTC m=+19.802897396" observedRunningTime="2026-04-17 17:24:14.776820736 +0000 UTC m=+20.748104078" watchObservedRunningTime="2026-04-17 17:24:14.777306356 +0000 UTC m=+20.748589724" Apr 17 17:24:14.793551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.793501 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8s5r4" podStartSLOduration=3.748941501 podStartE2EDuration="20.793484839s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.179399902 +0000 UTC m=+3.150683226" lastFinishedPulling="2026-04-17 17:24:14.22394324 +0000 UTC m=+20.195226564" observedRunningTime="2026-04-17 17:24:14.793246828 +0000 UTC m=+20.764530171" watchObservedRunningTime="2026-04-17 17:24:14.793484839 +0000 UTC m=+20.764768220" Apr 17 17:24:14.807693 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.807652 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8db7c" podStartSLOduration=3.7655828380000003 podStartE2EDuration="20.807638041s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.181885906 +0000 UTC m=+3.153169237" lastFinishedPulling="2026-04-17 17:24:14.223941108 +0000 UTC m=+20.195224440" observedRunningTime="2026-04-17 17:24:14.807279761 +0000 UTC m=+20.778563107" watchObservedRunningTime="2026-04-17 17:24:14.807638041 +0000 UTC m=+20.778921382" Apr 17 17:24:14.850866 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.850549 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pch4m" podStartSLOduration=3.812414838 podStartE2EDuration="20.85052705s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.185846435 +0000 UTC m=+3.157129757" lastFinishedPulling="2026-04-17 17:24:14.22395864 +0000 UTC m=+20.195241969" observedRunningTime="2026-04-17 17:24:14.849836167 +0000 UTC m=+20.821119509" watchObservedRunningTime="2026-04-17 17:24:14.85052705 +0000 UTC m=+20.821810392" Apr 17 17:24:14.869927 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:14.869876 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-72h2h" podStartSLOduration=3.792747639 podStartE2EDuration="20.869863158s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.183778427 +0000 UTC m=+3.155061753" lastFinishedPulling="2026-04-17 17:24:14.260893948 +0000 UTC m=+20.232177272" 
observedRunningTime="2026-04-17 17:24:14.869532607 +0000 UTC m=+20.840815947" watchObservedRunningTime="2026-04-17 17:24:14.869863158 +0000 UTC m=+20.841146498" Apr 17 17:24:15.636335 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.636313 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:15.636439 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:15.636411 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:15.702561 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.702403 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:24:15.774526 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.774490 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sj6nb" event={"ID":"45be972b-ce44-43f8-9b8b-860260b4c7ab","Type":"ContainerStarted","Data":"a5b446730496218d883f26d5f7c2ab509b97f68998d6df7ddca2e5a1d596a83a"} Apr 17 17:24:15.775998 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.775975 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" event={"ID":"43be0a75-8199-4093-a744-921df8b3380b","Type":"ContainerStarted","Data":"573aec39ca92a7f3f701dbb7cca8e8d7720a035b7e97a0cc4aabff756ad268ae"} Apr 17 17:24:15.778138 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.778121 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:24:15.778461 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.778441 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"80ec721b385280bf50e7e11472254e73f01a2e2fea437b942dae84bb557faf57"} Apr 17 17:24:15.778571 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.778465 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"39d8f8a1686168d380c99b1b9a62cf9f7dd2ba2eb705152e6544be70889ce849"} Apr 17 17:24:15.778571 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.778479 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"2331232130baea0a900c1dea4708d44a58a8a52b6ae37d0352da48a72e46bec6"} Apr 17 17:24:15.779771 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.779749 2580 generic.go:358] "Generic (PLEG): container finished" podID="afb0cf40-4c7d-4082-a5f2-64ef60067cde" containerID="394c93031fda04da78d8e62d426c5c2897d0e6b2f62941a655e6e9cb5494a8c2" exitCode=0 Apr 17 17:24:15.779859 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.779833 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" 
event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerDied","Data":"394c93031fda04da78d8e62d426c5c2897d0e6b2f62941a655e6e9cb5494a8c2"} Apr 17 17:24:15.789790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.789755 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sj6nb" podStartSLOduration=4.743835479 podStartE2EDuration="21.78974327s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.178028168 +0000 UTC m=+3.149311488" lastFinishedPulling="2026-04-17 17:24:14.223935959 +0000 UTC m=+20.195219279" observedRunningTime="2026-04-17 17:24:15.789168842 +0000 UTC m=+21.760452183" watchObservedRunningTime="2026-04-17 17:24:15.78974327 +0000 UTC m=+21.761026623" Apr 17 17:24:15.974057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.974022 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:24:15.974612 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:15.974593 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:24:16.282551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.282469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:16.282730 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:16.282656 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:16.282730 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:16.282719 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret podName:9458a330-4a73-457d-a605-d7998538c01b nodeName:}" failed. No retries permitted until 2026-04-17 17:24:32.282700227 +0000 UTC m=+38.253983552 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret") pod "global-pull-secret-syncer-8cqlg" (UID: "9458a330-4a73-457d-a605-d7998538c01b") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:16.583321 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.583215 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:15.702554337Z","UUID":"d850c461-a58b-4701-912f-0e28730d9a2a","Handler":null,"Name":"","Endpoint":""} Apr 17 17:24:16.585043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.585011 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:24:16.585043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.585042 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:24:16.637194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.636956 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:16.637194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.637010 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:16.637194 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:16.637099 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:16.637489 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:16.637222 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:16.784206 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:16.784167 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" event={"ID":"43be0a75-8199-4093-a744-921df8b3380b","Type":"ContainerStarted","Data":"27aa83821736417e4677197b8f414e6595b8718885583eaa59f984ce5bba0d2d"} Apr 17 17:24:17.636500 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:17.636468 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:17.636714 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:17.636607 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:17.789393 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:17.789360 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:24:17.790077 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:17.789767 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"2e65ad96d65bd8611940d214c051288d23158aafdca3c70ab962216de2498b10"} Apr 17 17:24:17.790077 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:17.789864 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:24:18.637020 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:18.636978 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:18.637214 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:18.636992 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:18.637214 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:18.637120 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:18.637329 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:18.637214 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:19.636883 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.636855 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:19.637339 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:19.636953 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:19.798139 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.797926 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:24:19.798621 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.798592 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"ba34dda09e69426e0f4cb6d40bf586faff842b0c9cd1442a640452bdf6874e44"} Apr 17 17:24:19.798961 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.798929 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:24:19.798961 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.798959 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:24:19.799094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.798971 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:24:19.799223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.799175 2580 scope.go:117] "RemoveContainer" containerID="e88701741f76565a285a939f2cc2e11a7c5fe9f292af2c086b82725dff83b2a5" Apr 17 17:24:19.801523 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.801487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerStarted","Data":"f86ac19f7b3aea133c7e59cdd859e442011eea9ce90ff154ee740a10cdb397be"} Apr 17 17:24:19.815309 ip-10-0-130-19 kubenswrapper[2580]: 
I0417 17:24:19.815282 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:24:19.817400 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.817371 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:24:19.831354 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:19.831291 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-khc69" podStartSLOduration=6.415602765 podStartE2EDuration="25.831271969s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.174614709 +0000 UTC m=+3.145898028" lastFinishedPulling="2026-04-17 17:24:16.590283913 +0000 UTC m=+22.561567232" observedRunningTime="2026-04-17 17:24:16.808858732 +0000 UTC m=+22.780142076" watchObservedRunningTime="2026-04-17 17:24:19.831271969 +0000 UTC m=+25.802555308" Apr 17 17:24:20.636412 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.636381 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:20.636550 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.636381 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:20.636550 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:20.636521 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:20.636676 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:20.636547 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:20.805694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.805670 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:24:20.806080 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.805985 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" event={"ID":"6afc6d79-46b9-4af3-84d9-3ed59a13c61a","Type":"ContainerStarted","Data":"397f2b808e13bc19ff8839f57a0a8bc410e3f933679027c8a73fa32d84bbb64b"} Apr 17 17:24:20.807564 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.807538 2580 generic.go:358] "Generic (PLEG): container finished" podID="afb0cf40-4c7d-4082-a5f2-64ef60067cde" containerID="f86ac19f7b3aea133c7e59cdd859e442011eea9ce90ff154ee740a10cdb397be" exitCode=0 Apr 17 17:24:20.807677 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.807570 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerDied","Data":"f86ac19f7b3aea133c7e59cdd859e442011eea9ce90ff154ee740a10cdb397be"} Apr 17 17:24:20.837503 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:20.837454 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" podStartSLOduration=9.717841573 podStartE2EDuration="26.837441056s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.183797227 +0000 UTC m=+3.155080554" lastFinishedPulling="2026-04-17 17:24:14.303396711 +0000 UTC m=+20.274680037" observedRunningTime="2026-04-17 17:24:20.836021299 +0000 UTC m=+26.807304639" watchObservedRunningTime="2026-04-17 17:24:20.837441056 +0000 UTC m=+26.808724396" Apr 17 17:24:21.612683 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.612486 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4qnzz"] Apr 17 17:24:21.612822 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.612778 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:21.612898 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:21.612856 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:21.616087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.616058 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8cqlg"] Apr 17 17:24:21.616217 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.616174 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:21.616294 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:21.616273 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:21.616681 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.616660 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-knvfd"] Apr 17 17:24:21.616792 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.616746 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:21.616841 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:21.616821 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:21.811873 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.811782 2580 generic.go:358] "Generic (PLEG): container finished" podID="afb0cf40-4c7d-4082-a5f2-64ef60067cde" containerID="b3f005f171786356d8cbd0fd6672a4cdef250ff3d0f5ce90c87784b1f563972d" exitCode=0 Apr 17 17:24:21.811873 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:21.811865 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerDied","Data":"b3f005f171786356d8cbd0fd6672a4cdef250ff3d0f5ce90c87784b1f563972d"} Apr 17 17:24:22.815774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:22.815691 2580 generic.go:358] "Generic (PLEG): container finished" podID="afb0cf40-4c7d-4082-a5f2-64ef60067cde" containerID="946a74e3e172f93e44629152d9a6ba2fa6ae0e0686916168b7f1d6cfa7609943" exitCode=0 Apr 17 17:24:22.815774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:22.815746 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerDied","Data":"946a74e3e172f93e44629152d9a6ba2fa6ae0e0686916168b7f1d6cfa7609943"} Apr 17 17:24:23.637023 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:23.636992 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:23.637182 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:23.637095 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:23.637300 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:23.637278 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:23.637411 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:23.637389 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:23.637507 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:23.637478 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:23.637642 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:23.637617 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:25.447204 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:25.447169 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:24:25.447916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:25.447329 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:24:25.448166 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:25.448137 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nz4ff" Apr 17 17:24:25.636682 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:25.636647 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:25.636870 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:25.636760 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8cqlg" podUID="9458a330-4a73-457d-a605-d7998538c01b" Apr 17 17:24:25.636870 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:25.636772 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:25.636870 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:25.636862 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qnzz" podUID="850cf630-0fb1-482f-9e3d-a1525bdf6a39" Apr 17 17:24:25.637024 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:25.636901 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:25.637024 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:25.636975 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:24:27.362397 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.362365 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeReady" Apr 17 17:24:27.363016 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.362512 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:24:27.408021 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.407992 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h5vmx"] Apr 17 17:24:27.411351 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.411328 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l6nv9"] Apr 17 17:24:27.411497 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.411483 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.413534 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.413515 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:27.413913 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.413887 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:24:27.413913 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.413905 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:24:27.414090 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.413915 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qbw75\"" Apr 17 17:24:27.416244 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.416224 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:24:27.416366 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.416341 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:24:27.416469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.416452 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hd29f\"" Apr 17 17:24:27.416556 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.416541 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:24:27.420907 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.420874 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h5vmx"] Apr 17 17:24:27.422661 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.422637 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l6nv9"] Apr 17 17:24:27.568271 
ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.568238 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c063b8d8-8182-438f-a272-69a64fcbb153-config-volume\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.568450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.568299 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2glwl\" (UniqueName: \"kubernetes.io/projected/1586f132-dd9c-4636-a7c7-87b1b730dc01-kube-api-access-2glwl\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:27.568450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.568341 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c063b8d8-8182-438f-a272-69a64fcbb153-tmp-dir\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.568450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.568403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.568610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.568452 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:27.568610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.568478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z45pm\" (UniqueName: \"kubernetes.io/projected/c063b8d8-8182-438f-a272-69a64fcbb153-kube-api-access-z45pm\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.636845 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.636765 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:27.636845 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.636791 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:27.637065 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.636764 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:27.640808 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.640775 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:24:27.641088 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.640776 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:24:27.641457 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.641437 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:24:27.641534 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.641492 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zp45b\"" Apr 17 17:24:27.641534 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.641510 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:24:27.642084 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.642067 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7c8rp\"" Apr 17 17:24:27.669375 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669344 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c063b8d8-8182-438f-a272-69a64fcbb153-tmp-dir\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.669553 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.669553 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669452 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:27.669553 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z45pm\" (UniqueName: \"kubernetes.io/projected/c063b8d8-8182-438f-a272-69a64fcbb153-kube-api-access-z45pm\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.669553 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669523 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c063b8d8-8182-438f-a272-69a64fcbb153-config-volume\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.669784 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669559 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2glwl\" (UniqueName: 
\"kubernetes.io/projected/1586f132-dd9c-4636-a7c7-87b1b730dc01-kube-api-access-2glwl\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:27.669784 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:27.669600 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:27.669784 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:27.669613 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:27.669784 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:27.669681 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:28.169660265 +0000 UTC m=+34.140943584 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:24:27.669784 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:27.669699 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:28.1696906 +0000 UTC m=+34.140973921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:27.669784 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.669773 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c063b8d8-8182-438f-a272-69a64fcbb153-tmp-dir\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.670180 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.670160 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c063b8d8-8182-438f-a272-69a64fcbb153-config-volume\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.681160 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.681007 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z45pm\" (UniqueName: \"kubernetes.io/projected/c063b8d8-8182-438f-a272-69a64fcbb153-kube-api-access-z45pm\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:27.681283 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:27.681101 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2glwl\" (UniqueName: \"kubernetes.io/projected/1586f132-dd9c-4636-a7c7-87b1b730dc01-kube-api-access-2glwl\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:28.174068 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.174030 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:28.174068 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.174082 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:28.174291 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:28.174192 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:28.174291 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:28.174194 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:28.174291 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:28.174266 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:29.174250063 +0000 UTC m=+35.145533388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:24:28.174291 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:28.174279 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:29.174273413 +0000 UTC m=+35.145556731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:28.275324 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.275289 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:24:28.275507 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:28.275425 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:24:28.275507 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:28.275496 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:00.27547709 +0000 UTC m=+66.246760409 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : secret "metrics-daemon-secret" not found Apr 17 17:24:28.377076 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.376272 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:28.379870 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.379847 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdcb\" (UniqueName: \"kubernetes.io/projected/850cf630-0fb1-482f-9e3d-a1525bdf6a39-kube-api-access-zvdcb\") pod \"network-check-target-4qnzz\" (UID: \"850cf630-0fb1-482f-9e3d-a1525bdf6a39\") " pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:28.556132 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.556078 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:28.705489 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.705425 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4qnzz"] Apr 17 17:24:28.709829 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:24:28.709794 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod850cf630_0fb1_482f_9e3d_a1525bdf6a39.slice/crio-fb9868aa90cc46ac2b4f2ad4f1ac55b5acaa2666b0d64f040146cb7ab1aa72a1 WatchSource:0}: Error finding container fb9868aa90cc46ac2b4f2ad4f1ac55b5acaa2666b0d64f040146cb7ab1aa72a1: Status 404 returned error can't find the container with id fb9868aa90cc46ac2b4f2ad4f1ac55b5acaa2666b0d64f040146cb7ab1aa72a1 Apr 17 17:24:28.830110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.830079 2580 generic.go:358] "Generic (PLEG): container finished" podID="afb0cf40-4c7d-4082-a5f2-64ef60067cde" containerID="9a109c9ff651445afca38cee411afccaf2f56808ad97ce665cbcbf0c8833a0ed" exitCode=0 Apr 17 17:24:28.830264 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.830165 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerDied","Data":"9a109c9ff651445afca38cee411afccaf2f56808ad97ce665cbcbf0c8833a0ed"} Apr 17 17:24:28.831217 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:28.831197 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4qnzz" event={"ID":"850cf630-0fb1-482f-9e3d-a1525bdf6a39","Type":"ContainerStarted","Data":"fb9868aa90cc46ac2b4f2ad4f1ac55b5acaa2666b0d64f040146cb7ab1aa72a1"} Apr 17 17:24:29.181917 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:29.181882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:29.182119 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:29.181935 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:29.182119 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:29.182059 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:29.182119 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:29.182091 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:29.182274 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:29.182141 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:31.182118546 +0000 UTC m=+37.153401866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:29.182274 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:29.182160 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:31.182150659 +0000 UTC m=+37.153433978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:24:29.835821 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:29.835785 2580 generic.go:358] "Generic (PLEG): container finished" podID="afb0cf40-4c7d-4082-a5f2-64ef60067cde" containerID="8d8bc031e234d24a899e21280ac78a53eadd1ce26a8f0ad55637de9add3b3958" exitCode=0 Apr 17 17:24:29.836277 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:29.835831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerDied","Data":"8d8bc031e234d24a899e21280ac78a53eadd1ce26a8f0ad55637de9add3b3958"} Apr 17 17:24:30.841437 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:30.841167 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" event={"ID":"afb0cf40-4c7d-4082-a5f2-64ef60067cde","Type":"ContainerStarted","Data":"bb09ad953cad53706a1ca7c4cb60fa601d358efa8a631c3fa906fc9d7e6e08a3"} Apr 17 17:24:30.866659 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:30.866575 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5nrtn" podStartSLOduration=5.695273427 podStartE2EDuration="36.866558033s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:23:57.181001511 +0000 UTC m=+3.152284853" lastFinishedPulling="2026-04-17 17:24:28.352286132 +0000 UTC m=+34.323569459" observedRunningTime="2026-04-17 17:24:30.864905414 +0000 UTC m=+36.836188756" watchObservedRunningTime="2026-04-17 
17:24:30.866558033 +0000 UTC m=+36.837841375" Apr 17 17:24:31.199640 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:31.199598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:31.199813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:31.199693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:31.199813 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:31.199746 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:31.199813 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:31.199773 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:31.199980 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:31.199826 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:35.199809221 +0000 UTC m=+41.171092562 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:24:31.199980 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:31.199842 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:35.199836659 +0000 UTC m=+41.171119978 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:31.844476 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:31.844437 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4qnzz" event={"ID":"850cf630-0fb1-482f-9e3d-a1525bdf6a39","Type":"ContainerStarted","Data":"b15a071a56b5339f7cf808d143b2c333fd15a9221506431d6fd7f6454cfa44e8"} Apr 17 17:24:31.845124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:31.844786 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:24:31.863874 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:31.863829 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4qnzz" podStartSLOduration=34.880872327 podStartE2EDuration="37.863796778s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:24:28.71186067 +0000 UTC m=+34.683144003" lastFinishedPulling="2026-04-17 17:24:31.694785131 +0000 UTC m=+37.666068454" observedRunningTime="2026-04-17 17:24:31.8626626 +0000 UTC m=+37.833945940" watchObservedRunningTime="2026-04-17 17:24:31.863796778 +0000 UTC m=+37.835080160" Apr 17 17:24:32.307970 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:32.307929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:32.311845 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:32.311819 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9458a330-4a73-457d-a605-d7998538c01b-original-pull-secret\") pod \"global-pull-secret-syncer-8cqlg\" (UID: \"9458a330-4a73-457d-a605-d7998538c01b\") " pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:32.449224 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:32.449193 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8cqlg" Apr 17 17:24:32.583254 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:32.583099 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8cqlg"] Apr 17 17:24:32.586061 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:24:32.586029 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9458a330_4a73_457d_a605_d7998538c01b.slice/crio-2f7469568c15d76dc1fc77a5a3431fe1d903251de893fbee0e2d3073f7eba451 WatchSource:0}: Error finding container 2f7469568c15d76dc1fc77a5a3431fe1d903251de893fbee0e2d3073f7eba451: Status 404 returned error can't find the container with id 2f7469568c15d76dc1fc77a5a3431fe1d903251de893fbee0e2d3073f7eba451 Apr 17 17:24:32.847441 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:32.847352 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8cqlg" event={"ID":"9458a330-4a73-457d-a605-d7998538c01b","Type":"ContainerStarted","Data":"2f7469568c15d76dc1fc77a5a3431fe1d903251de893fbee0e2d3073f7eba451"} Apr 17 17:24:35.228175 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:35.228137 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:35.228625 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:35.228192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:35.228625 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:35.228324 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:35.228625 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:35.228359 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:35.228625 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:35.228393 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:43.228371244 +0000 UTC m=+49.199654562 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:24:35.228625 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:35.228430 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:43.228402669 +0000 UTC m=+49.199685989 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:36.855765 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:36.855733 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8cqlg" event={"ID":"9458a330-4a73-457d-a605-d7998538c01b","Type":"ContainerStarted","Data":"355fda6c9c407b3c225c76f4203e7f6e823af45854cd1110e643d47b77c1d6d9"} Apr 17 17:24:36.873352 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:36.873297 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8cqlg" podStartSLOduration=33.128700495 podStartE2EDuration="36.873281139s" podCreationTimestamp="2026-04-17 17:24:00 +0000 UTC" firstStartedPulling="2026-04-17 17:24:32.587782338 +0000 UTC m=+38.559065671" lastFinishedPulling="2026-04-17 17:24:36.332362982 +0000 UTC m=+42.303646315" observedRunningTime="2026-04-17 17:24:36.872858316 +0000 UTC m=+42.844141656" watchObservedRunningTime="2026-04-17 17:24:36.873281139 +0000 UTC m=+42.844564545" Apr 17 17:24:43.283039 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:43.282993 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:43.283039 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:43.283037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:43.283510 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:43.283140 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:43.283510 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:43.283152 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:43.283510 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:43.283201 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.283185342 +0000 UTC m=+65.254468661 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:24:43.283510 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:43.283213 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:59.28320753 +0000 UTC m=+65.254490849 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:51.822790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:51.822762 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rjptt" Apr 17 17:24:59.283658 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:59.283603 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:24:59.283658 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:24:59.283663 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:24:59.284165 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:59.283750 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:59.284165 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:59.283812 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:31.283797067 +0000 UTC m=+97.255080387 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:24:59.284165 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:59.283758 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:59.284165 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:24:59.283883 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:31.283870191 +0000 UTC m=+97.255153509 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:25:00.290406 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:00.290362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:25:00.290823 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:00.290518 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:25:00.290823 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:00.290631 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:26:04.290568746 +0000 UTC m=+130.261852065 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : secret "metrics-daemon-secret" not found Apr 17 17:25:03.851549 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:03.851519 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4qnzz" Apr 17 17:25:06.990658 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:06.990622 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r"] Apr 17 17:25:07.044132 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.044093 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r"] Apr 17 17:25:07.044302 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.044228 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.047839 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.047807 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:25:07.047977 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.047850 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:25:07.047977 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.047807 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:25:07.047977 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.047890 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-n5452\"" Apr 17 17:25:07.048144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.048046 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 17:25:07.136678 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.136642 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/073134c3-0c35-4c37-9325-58f539441c6d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-577b6978c-l6n2r\" (UID: \"073134c3-0c35-4c37-9325-58f539441c6d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.136678 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.136678 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84gb\" (UniqueName: \"kubernetes.io/projected/073134c3-0c35-4c37-9325-58f539441c6d-kube-api-access-j84gb\") pod \"managed-serviceaccount-addon-agent-577b6978c-l6n2r\" (UID: \"073134c3-0c35-4c37-9325-58f539441c6d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.237742 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.237710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/073134c3-0c35-4c37-9325-58f539441c6d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-577b6978c-l6n2r\" (UID: \"073134c3-0c35-4c37-9325-58f539441c6d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.237742 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.237742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j84gb\" (UniqueName: \"kubernetes.io/projected/073134c3-0c35-4c37-9325-58f539441c6d-kube-api-access-j84gb\") pod \"managed-serviceaccount-addon-agent-577b6978c-l6n2r\" (UID: \"073134c3-0c35-4c37-9325-58f539441c6d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.240977 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.240930 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/073134c3-0c35-4c37-9325-58f539441c6d-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-577b6978c-l6n2r\" (UID: \"073134c3-0c35-4c37-9325-58f539441c6d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.249419 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.249394 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84gb\" (UniqueName: \"kubernetes.io/projected/073134c3-0c35-4c37-9325-58f539441c6d-kube-api-access-j84gb\") pod \"managed-serviceaccount-addon-agent-577b6978c-l6n2r\" (UID: \"073134c3-0c35-4c37-9325-58f539441c6d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.369035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.368998 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" Apr 17 17:25:07.482551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.482521 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r"] Apr 17 17:25:07.486865 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:25:07.486832 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod073134c3_0c35_4c37_9325_58f539441c6d.slice/crio-44392e14a34d8330d58c887228cd9bd8ae7e826f95e5b19b9165045954d06f89 WatchSource:0}: Error finding container 44392e14a34d8330d58c887228cd9bd8ae7e826f95e5b19b9165045954d06f89: Status 404 returned error can't find the container with id 44392e14a34d8330d58c887228cd9bd8ae7e826f95e5b19b9165045954d06f89 Apr 17 17:25:07.914884 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:07.914842 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" event={"ID":"073134c3-0c35-4c37-9325-58f539441c6d","Type":"ContainerStarted","Data":"44392e14a34d8330d58c887228cd9bd8ae7e826f95e5b19b9165045954d06f89"} Apr 17 17:25:10.921922 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:10.921884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" event={"ID":"073134c3-0c35-4c37-9325-58f539441c6d","Type":"ContainerStarted","Data":"6f90e3a29419f2416987b4562bbae9ff9cad1f6a52421cf31a7f14f620eca83f"} Apr 17 17:25:10.939385 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:10.939332 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-577b6978c-l6n2r" podStartSLOduration=2.151245932 podStartE2EDuration="4.939317629s" podCreationTimestamp="2026-04-17 17:25:06 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.488743351 +0000 UTC m=+73.460026675" lastFinishedPulling="2026-04-17 17:25:10.276815039 +0000 UTC m=+76.248098372" observedRunningTime="2026-04-17 17:25:10.938304761 +0000 UTC m=+76.909588101" watchObservedRunningTime="2026-04-17 17:25:10.939317629 +0000 UTC m=+76.910601008" Apr 17 17:25:31.296846 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:31.296793 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 
17:25:31.296846 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:31.296851 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:25:31.297332 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:31.296947 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:31.297332 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:31.296958 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:31.297332 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:31.297006 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert podName:1586f132-dd9c-4636-a7c7-87b1b730dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:35.296993454 +0000 UTC m=+161.268276773 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert") pod "ingress-canary-l6nv9" (UID: "1586f132-dd9c-4636-a7c7-87b1b730dc01") : secret "canary-serving-cert" not found Apr 17 17:25:31.297332 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:31.297038 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls podName:c063b8d8-8182-438f-a272-69a64fcbb153 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:35.297018608 +0000 UTC m=+161.268301943 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls") pod "dns-default-h5vmx" (UID: "c063b8d8-8182-438f-a272-69a64fcbb153") : secret "dns-default-metrics-tls" not found Apr 17 17:25:50.574154 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.574090 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq"] Apr 17 17:25:50.577269 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.577244 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d5589f9d4-fkjkk"] Apr 17 17:25:50.577407 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.577382 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:50.579062 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.579041 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.579966 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.579950 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-m6lvl\"" Apr 17 17:25:50.580455 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.580439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 17:25:50.580603 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.580574 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 17:25:50.580716 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.580702 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:50.584679 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.584660 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:25:50.584769 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.584662 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:25:50.585049 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.585032 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jk79w\"" Apr 17 17:25:50.585132 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.585074 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:25:50.589603 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.589562 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq"] Apr 17 17:25:50.592196 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.592171 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5589f9d4-fkjkk"] Apr 17 17:25:50.592432 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.592411 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:25:50.631287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6hv4\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-kube-api-access-b6hv4\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631306 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-certificates\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631374 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-image-registry-private-configuration\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631429 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:50.631469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631452 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-trusted-ca\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631469 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-installation-pull-secrets\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631496 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw7fx\" (UniqueName: \"kubernetes.io/projected/3f6de31f-775d-42c4-9aa8-91d5f855192b-kube-api-access-gw7fx\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:50.631774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-bound-sa-token\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.631774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.631647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe2dbffa-3af1-4b17-ae47-661d7b154a27-ca-trust-extracted\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " 
pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.731983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.731935 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:50.731983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.731983 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-trusted-ca\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732207 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-installation-pull-secrets\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732207 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gw7fx\" (UniqueName: \"kubernetes.io/projected/3f6de31f-775d-42c4-9aa8-91d5f855192b-kube-api-access-gw7fx\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:50.732207 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732057 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-bound-sa-token\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732207 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732381 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:50.732269 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:50.732431 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:50.732382 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls podName:3f6de31f-775d-42c4-9aa8-91d5f855192b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:51.232350962 +0000 UTC m=+117.203636144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d8pmq" (UID: "3f6de31f-775d-42c4-9aa8-91d5f855192b") : secret "samples-operator-tls" not found Apr 17 17:25:50.732431 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:50.732394 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:50.732431 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:50.732419 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5589f9d4-fkjkk: secret "image-registry-tls" not found Apr 17 17:25:50.732431 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe2dbffa-3af1-4b17-ae47-661d7b154a27-ca-trust-extracted\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732651 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732477 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hv4\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-kube-api-access-b6hv4\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732651 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:50.732496 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls podName:fe2dbffa-3af1-4b17-ae47-661d7b154a27 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:51.232474045 +0000 UTC m=+117.203757378 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls") pod "image-registry-d5589f9d4-fkjkk" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27") : secret "image-registry-tls" not found Apr 17 17:25:50.732756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-certificates\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732807 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-image-registry-private-configuration\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.732928 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.732907 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe2dbffa-3af1-4b17-ae47-661d7b154a27-ca-trust-extracted\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.733246 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.733214 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-certificates\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.733539 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.733519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-trusted-ca\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.734555 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.734539 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-installation-pull-secrets\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.734850 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.734833 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-image-registry-private-configuration\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.742800 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.742780 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw7fx\" (UniqueName: 
\"kubernetes.io/projected/3f6de31f-775d-42c4-9aa8-91d5f855192b-kube-api-access-gw7fx\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:50.742927 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.742909 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-bound-sa-token\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:50.743142 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:50.743122 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hv4\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-kube-api-access-b6hv4\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:51.236845 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:51.236787 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:51.236845 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:51.236857 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:51.237073 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:51.236945 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:51.237073 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:51.236968 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:51.237073 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:51.236979 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5589f9d4-fkjkk: secret "image-registry-tls" not found Apr 17 17:25:51.237073 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:51.237013 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls podName:3f6de31f-775d-42c4-9aa8-91d5f855192b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:52.23699836 +0000 UTC m=+118.208281679 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d8pmq" (UID: "3f6de31f-775d-42c4-9aa8-91d5f855192b") : secret "samples-operator-tls" not found Apr 17 17:25:51.237073 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:51.237028 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls podName:fe2dbffa-3af1-4b17-ae47-661d7b154a27 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:52.237021352 +0000 UTC m=+118.208304670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls") pod "image-registry-d5589f9d4-fkjkk" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27") : secret "image-registry-tls" not found Apr 17 17:25:52.244956 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:52.244909 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:52.245353 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:52.245075 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:52.245353 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:52.245090 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:52.245353 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:52.245099 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5589f9d4-fkjkk: secret "image-registry-tls" not found Apr 17 17:25:52.245353 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:52.245156 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:52.245353 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:52.245212 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls podName:3f6de31f-775d-42c4-9aa8-91d5f855192b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.245194695 +0000 UTC m=+120.216478016 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d8pmq" (UID: "3f6de31f-775d-42c4-9aa8-91d5f855192b") : secret "samples-operator-tls" not found Apr 17 17:25:52.245353 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:52.245225 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls podName:fe2dbffa-3af1-4b17-ae47-661d7b154a27 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:54.245218542 +0000 UTC m=+120.216501861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls") pod "image-registry-d5589f9d4-fkjkk" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27") : secret "image-registry-tls" not found Apr 17 17:25:54.259539 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:54.259489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:54.259924 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:54.259554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:54.259924 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:54.259686 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:54.259924 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:54.259690 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:54.259924 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:54.259768 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls podName:3f6de31f-775d-42c4-9aa8-91d5f855192b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.259751562 +0000 UTC m=+124.231034881 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d8pmq" (UID: "3f6de31f-775d-42c4-9aa8-91d5f855192b") : secret "samples-operator-tls" not found Apr 17 17:25:54.259924 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:54.259697 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5589f9d4-fkjkk: secret "image-registry-tls" not found Apr 17 17:25:54.259924 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:54.259841 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls podName:fe2dbffa-3af1-4b17-ae47-661d7b154a27 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.259828554 +0000 UTC m=+124.231111873 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls") pod "image-registry-d5589f9d4-fkjkk" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27") : secret "image-registry-tls" not found Apr 17 17:25:55.954120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:55.954087 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l"] Apr 17 17:25:55.956552 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:55.956535 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" Apr 17 17:25:55.959099 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:55.959080 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 17:25:55.959995 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:55.959978 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:55.960068 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:55.960015 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-4qbq2\"" Apr 17 17:25:55.967280 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:55.967260 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l"] Apr 17 17:25:56.075479 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:56.075444 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5cgr\" (UniqueName: \"kubernetes.io/projected/e4d9e99e-ca09-49e3-a1d6-e5beebfc6147-kube-api-access-n5cgr\") pod \"migrator-74bb7799d9-5xd9l\" (UID: \"e4d9e99e-ca09-49e3-a1d6-e5beebfc6147\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" Apr 17 17:25:56.175996 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:56.175966 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5cgr\" (UniqueName: \"kubernetes.io/projected/e4d9e99e-ca09-49e3-a1d6-e5beebfc6147-kube-api-access-n5cgr\") pod \"migrator-74bb7799d9-5xd9l\" (UID: \"e4d9e99e-ca09-49e3-a1d6-e5beebfc6147\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" Apr 17 17:25:56.183984 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:56.183957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5cgr\" (UniqueName: \"kubernetes.io/projected/e4d9e99e-ca09-49e3-a1d6-e5beebfc6147-kube-api-access-n5cgr\") pod \"migrator-74bb7799d9-5xd9l\" (UID: \"e4d9e99e-ca09-49e3-a1d6-e5beebfc6147\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" Apr 17 17:25:56.265200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:56.265107 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" Apr 17 17:25:56.377956 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:56.377927 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l"] Apr 17 17:25:56.381233 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:25:56.381197 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4d9e99e_ca09_49e3_a1d6_e5beebfc6147.slice/crio-33f7cff9c6d32331614f4151456951e74e1d07cbad179c908ead30cab9ccdb78 WatchSource:0}: Error finding container 33f7cff9c6d32331614f4151456951e74e1d07cbad179c908ead30cab9ccdb78: Status 404 returned error can't find the container with id 33f7cff9c6d32331614f4151456951e74e1d07cbad179c908ead30cab9ccdb78 Apr 17 17:25:57.013871 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:57.013831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" event={"ID":"e4d9e99e-ca09-49e3-a1d6-e5beebfc6147","Type":"ContainerStarted","Data":"33f7cff9c6d32331614f4151456951e74e1d07cbad179c908ead30cab9ccdb78"} Apr 17 17:25:58.017593 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:58.017550 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" event={"ID":"e4d9e99e-ca09-49e3-a1d6-e5beebfc6147","Type":"ContainerStarted","Data":"a3bec9d0e5695e6fb2e949c8534de9869604c352a0c1ea9b0dee16ee84ce6c62"} Apr 17 17:25:58.018074 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:58.017608 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" event={"ID":"e4d9e99e-ca09-49e3-a1d6-e5beebfc6147","Type":"ContainerStarted","Data":"b3124d45a8da19e36066c108d76b58e70ad8e33296711f35a161c3ed003efbed"} Apr 17 17:25:58.037062 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:58.037010 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5xd9l" podStartSLOduration=2.086185451 podStartE2EDuration="3.036996483s" podCreationTimestamp="2026-04-17 17:25:55 +0000 UTC" firstStartedPulling="2026-04-17 17:25:56.383065974 +0000 UTC m=+122.354349294" lastFinishedPulling="2026-04-17 17:25:57.333876803 +0000 UTC m=+123.305160326" observedRunningTime="2026-04-17 17:25:58.03544375 +0000 UTC m=+124.006727095" watchObservedRunningTime="2026-04-17 17:25:58.036996483 +0000 UTC m=+124.008279848" Apr 17 17:25:58.292841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:58.292760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:25:58.292957 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:58.292899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:25:58.292957 ip-10-0-130-19 
kubenswrapper[2580]: E0417 17:25:58.292902 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:58.292957 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:58.292921 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5589f9d4-fkjkk: secret "image-registry-tls" not found Apr 17 17:25:58.293055 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:58.292969 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls podName:fe2dbffa-3af1-4b17-ae47-661d7b154a27 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:06.292953804 +0000 UTC m=+132.264237123 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls") pod "image-registry-d5589f9d4-fkjkk" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27") : secret "image-registry-tls" not found Apr 17 17:25:58.293055 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:58.292978 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:58.293055 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:25:58.293017 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls podName:3f6de31f-775d-42c4-9aa8-91d5f855192b nodeName:}" failed. No retries permitted until 2026-04-17 17:26:06.293006065 +0000 UTC m=+132.264289384 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d8pmq" (UID: "3f6de31f-775d-42c4-9aa8-91d5f855192b") : secret "samples-operator-tls" not found Apr 17 17:25:58.794321 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:58.794291 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pch4m_23dab589-f077-4e94-93bc-392122228de4/dns-node-resolver/0.log" Apr 17 17:25:59.036512 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.036480 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-cgpx9"] Apr 17 17:25:59.038432 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.038417 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.041893 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.041867 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 17:25:59.042028 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.041916 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 17:25:59.042110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.042097 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 17:25:59.042958 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.042942 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-h8zpn\"" Apr 17 17:25:59.043075 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.043061 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 17:25:59.051412 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.051385 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-cgpx9"] Apr 17 17:25:59.098917 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.098874 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-signing-key\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.099087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.098926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-signing-cabundle\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.099087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.099014 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sn29\" (UniqueName: \"kubernetes.io/projected/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-kube-api-access-4sn29\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.199912 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.199877 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sn29\" (UniqueName: \"kubernetes.io/projected/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-kube-api-access-4sn29\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.200034 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.199942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-signing-key\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.200034 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:25:59.199968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-signing-cabundle\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.200603 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.200560 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-signing-cabundle\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.202279 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.202253 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-signing-key\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.209256 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.209233 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sn29\" (UniqueName: \"kubernetes.io/projected/1a0a173b-7096-4f11-aba1-cf86ba7c1eb1-kube-api-access-4sn29\") pod \"service-ca-865cb79987-cgpx9\" (UID: \"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1\") " pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.347923 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.347892 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-cgpx9" Apr 17 17:25:59.464342 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.464310 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-cgpx9"] Apr 17 17:25:59.468331 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:25:59.468303 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a0a173b_7096_4f11_aba1_cf86ba7c1eb1.slice/crio-cb38077bee2a8ed19a94801d6c19d88bf9958bee6c30d7ee34a1a972b1d8b35b WatchSource:0}: Error finding container cb38077bee2a8ed19a94801d6c19d88bf9958bee6c30d7ee34a1a972b1d8b35b: Status 404 returned error can't find the container with id cb38077bee2a8ed19a94801d6c19d88bf9958bee6c30d7ee34a1a972b1d8b35b Apr 17 17:25:59.594102 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:25:59.594072 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8db7c_9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c/node-ca/0.log" Apr 17 17:26:00.022851 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:00.022814 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-cgpx9" event={"ID":"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1","Type":"ContainerStarted","Data":"cb38077bee2a8ed19a94801d6c19d88bf9958bee6c30d7ee34a1a972b1d8b35b"} Apr 17 17:26:00.994108 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:00.994083 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5xd9l_e4d9e99e-ca09-49e3-a1d6-e5beebfc6147/migrator/0.log" Apr 17 17:26:01.201336 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:01.201306 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5xd9l_e4d9e99e-ca09-49e3-a1d6-e5beebfc6147/graceful-termination/0.log" Apr 17 17:26:02.028233 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:02.028197 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-cgpx9" event={"ID":"1a0a173b-7096-4f11-aba1-cf86ba7c1eb1","Type":"ContainerStarted","Data":"871a33cb75df14279eed43681042924b20f8b5aa496d1d1736453a6028ec25a2"} Apr 17 17:26:02.045421 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:02.045374 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-cgpx9" podStartSLOduration=1.533894031 podStartE2EDuration="3.045358816s" podCreationTimestamp="2026-04-17 17:25:59 +0000 UTC" firstStartedPulling="2026-04-17 17:25:59.470216052 +0000 UTC m=+125.441499372" lastFinishedPulling="2026-04-17 17:26:00.981680817 +0000 UTC m=+126.952964157" observedRunningTime="2026-04-17 17:26:02.044665765 +0000 UTC m=+128.015949107" watchObservedRunningTime="2026-04-17 17:26:02.045358816 +0000 UTC m=+128.016642157" Apr 17 17:26:04.341145 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:04.341111 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:26:04.341543 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:04.341264 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:26:04.341543 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:04.341334 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs podName:1227f475-d747-4720-ad95-d72a46d6d1fb nodeName:}" failed. No retries permitted until 2026-04-17 17:28:06.341315215 +0000 UTC m=+252.312598536 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs") pod "network-metrics-daemon-knvfd" (UID: "1227f475-d747-4720-ad95-d72a46d6d1fb") : secret "metrics-daemon-secret" not found Apr 17 17:26:06.357728 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:06.357669 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") pod \"image-registry-d5589f9d4-fkjkk\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:26:06.358200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:06.357786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:26:06.358200 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:06.357820 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:26:06.358200 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:06.357841 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5589f9d4-fkjkk: secret "image-registry-tls" not found Apr 17 17:26:06.358200 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:06.357894 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls podName:fe2dbffa-3af1-4b17-ae47-661d7b154a27 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:22.357880289 +0000 UTC m=+148.329163608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls") pod "image-registry-d5589f9d4-fkjkk" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27") : secret "image-registry-tls" not found Apr 17 17:26:06.360083 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:06.360063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f6de31f-775d-42c4-9aa8-91d5f855192b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d8pmq\" (UID: \"3f6de31f-775d-42c4-9aa8-91d5f855192b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:26:06.490331 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:06.490296 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-m6lvl\"" Apr 17 17:26:06.498673 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:06.498649 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" Apr 17 17:26:06.616804 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:06.616730 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq"] Apr 17 17:26:07.040237 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:07.040142 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" event={"ID":"3f6de31f-775d-42c4-9aa8-91d5f855192b","Type":"ContainerStarted","Data":"0a488a88fcc1453242cac39d24c0ebf9d2b3600f7bc98a33373ab3f77ec2b965"} Apr 17 17:26:09.047078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:09.047036 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" event={"ID":"3f6de31f-775d-42c4-9aa8-91d5f855192b","Type":"ContainerStarted","Data":"4b9f6cccf803a2f30878ffe7ed48f3270050cd6aed3f67f8a81e7776a021d3a9"} Apr 17 17:26:09.047078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:09.047075 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" event={"ID":"3f6de31f-775d-42c4-9aa8-91d5f855192b","Type":"ContainerStarted","Data":"d5f3d092d2fdd6d44d9213fbcdfb1e161bb8047542f4a3505269aca04d66b9d1"} Apr 17 17:26:09.065240 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:09.065183 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d8pmq" podStartSLOduration=17.436480552 podStartE2EDuration="19.065167689s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:26:06.659951883 +0000 UTC m=+132.631235201" lastFinishedPulling="2026-04-17 17:26:08.288639019 +0000 UTC m=+134.259922338" observedRunningTime="2026-04-17 17:26:09.063398638 +0000 UTC m=+135.034681980" watchObservedRunningTime="2026-04-17 17:26:09.065167689 +0000 UTC m=+135.036451007" Apr 17 17:26:20.553044 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.553009 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d5589f9d4-fkjkk"] Apr 17 17:26:20.553408 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:20.553198 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" podUID="fe2dbffa-3af1-4b17-ae47-661d7b154a27" Apr 17 17:26:20.660964 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.660933 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d966q"] Apr 17 17:26:20.664019 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.664001 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.666611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.666565 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:26:20.666724 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.666679 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:26:20.666790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.666725 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:26:20.666844 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.666791 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fcb89\"" Apr 17 17:26:20.667510 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.667496 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:26:20.674543 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.674525 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d966q"] Apr 17 17:26:20.766873 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.766834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/70f0e0b4-2719-4e2b-95f2-d115223c13dd-crio-socket\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.767051 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.766878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/70f0e0b4-2719-4e2b-95f2-d115223c13dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.767107 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.767051 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/70f0e0b4-2719-4e2b-95f2-d115223c13dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.767107 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.767083 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/70f0e0b4-2719-4e2b-95f2-d115223c13dd-data-volume\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.767188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.767146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2dzg\" (UniqueName: \"kubernetes.io/projected/70f0e0b4-2719-4e2b-95f2-d115223c13dd-kube-api-access-j2dzg\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " 
pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868481 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/70f0e0b4-2719-4e2b-95f2-d115223c13dd-crio-socket\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868660 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868491 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/70f0e0b4-2719-4e2b-95f2-d115223c13dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868660 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868514 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/70f0e0b4-2719-4e2b-95f2-d115223c13dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868660 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868606 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/70f0e0b4-2719-4e2b-95f2-d115223c13dd-crio-socket\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868660 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/70f0e0b4-2719-4e2b-95f2-d115223c13dd-data-volume\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868814 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868716 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2dzg\" (UniqueName: \"kubernetes.io/projected/70f0e0b4-2719-4e2b-95f2-d115223c13dd-kube-api-access-j2dzg\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.868965 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.868944 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/70f0e0b4-2719-4e2b-95f2-d115223c13dd-data-volume\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.869083 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.869064 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/70f0e0b4-2719-4e2b-95f2-d115223c13dd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.870789 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.870770 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/70f0e0b4-2719-4e2b-95f2-d115223c13dd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.877663 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.877644 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2dzg\" (UniqueName: \"kubernetes.io/projected/70f0e0b4-2719-4e2b-95f2-d115223c13dd-kube-api-access-j2dzg\") pod \"insights-runtime-extractor-d966q\" (UID: \"70f0e0b4-2719-4e2b-95f2-d115223c13dd\") " pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:20.972978 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:20.972944 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d966q" Apr 17 17:26:21.073236 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.073209 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:26:21.077479 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.077456 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:26:21.083291 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.083270 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d966q"] Apr 17 17:26:21.087108 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:21.087085 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f0e0b4_2719_4e2b_95f2_d115223c13dd.slice/crio-fb9ac0b66c515d33730b6ccdb42d3903e871080ff73cc5b8157f6e55e39d2e23 WatchSource:0}: Error finding container fb9ac0b66c515d33730b6ccdb42d3903e871080ff73cc5b8157f6e55e39d2e23: Status 404 returned error can't find the container with id fb9ac0b66c515d33730b6ccdb42d3903e871080ff73cc5b8157f6e55e39d2e23 Apr 17 17:26:21.171189 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171151 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe2dbffa-3af1-4b17-ae47-661d7b154a27-ca-trust-extracted\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171205 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-trusted-ca\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171236 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6hv4\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-kube-api-access-b6hv4\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171272 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-image-registry-private-configuration\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171305 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-installation-pull-secrets\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171527 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171352 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-certificates\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171527 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171384 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-bound-sa-token\") pod \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\" (UID: \"fe2dbffa-3af1-4b17-ae47-661d7b154a27\") " Apr 17 17:26:21.171527 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171453 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe2dbffa-3af1-4b17-ae47-661d7b154a27-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:26:21.171710 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171684 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe2dbffa-3af1-4b17-ae47-661d7b154a27-ca-trust-extracted\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:21.171763 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171735 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:21.171899 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.171871 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:21.173498 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.173474 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:21.173594 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.173477 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-kube-api-access-b6hv4" (OuterVolumeSpecName: "kube-api-access-b6hv4") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "kube-api-access-b6hv4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:21.173755 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.173720 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:21.173755 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.173720 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fe2dbffa-3af1-4b17-ae47-661d7b154a27" (UID: "fe2dbffa-3af1-4b17-ae47-661d7b154a27"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:21.272798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.272765 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-trusted-ca\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:21.272798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.272792 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6hv4\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-kube-api-access-b6hv4\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:21.272798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.272803 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-image-registry-private-configuration\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:21.272996 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.272815 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe2dbffa-3af1-4b17-ae47-661d7b154a27-installation-pull-secrets\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:21.272996 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.272825 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-certificates\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:21.272996 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:21.272835 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-bound-sa-token\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:22.077525 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.077464 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-d5589f9d4-fkjkk" Apr 17 17:26:22.077525 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.077463 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d966q" event={"ID":"70f0e0b4-2719-4e2b-95f2-d115223c13dd","Type":"ContainerStarted","Data":"9a44eb64d373d7bdfe7562cacca2fb499e54fac8ce9e521e5666a3113d3c743b"} Apr 17 17:26:22.077525 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.077500 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d966q" event={"ID":"70f0e0b4-2719-4e2b-95f2-d115223c13dd","Type":"ContainerStarted","Data":"3920cf46d91397974aab4242b37fb9d9e29320d5f68707dc945732e3c86575ab"} Apr 17 17:26:22.077525 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.077514 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d966q" event={"ID":"70f0e0b4-2719-4e2b-95f2-d115223c13dd","Type":"ContainerStarted","Data":"fb9ac0b66c515d33730b6ccdb42d3903e871080ff73cc5b8157f6e55e39d2e23"} Apr 17 17:26:22.113633 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.113595 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d5589f9d4-fkjkk"] Apr 17 17:26:22.115813 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.115790 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-d5589f9d4-fkjkk"] Apr 17 17:26:22.279064 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.279012 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe2dbffa-3af1-4b17-ae47-661d7b154a27-registry-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:22.639960 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:22.639930 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2dbffa-3af1-4b17-ae47-661d7b154a27" path="/var/lib/kubelet/pods/fe2dbffa-3af1-4b17-ae47-661d7b154a27/volumes" Apr 17 17:26:24.083399 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:24.083364 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d966q" event={"ID":"70f0e0b4-2719-4e2b-95f2-d115223c13dd","Type":"ContainerStarted","Data":"f92b224242395e1dfe85855c26e8e6ced5fdb4fd3e5a5ba9518da1d7fc3532ff"} Apr 17 17:26:24.102420 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:24.102373 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d966q" podStartSLOduration=2.14227208 podStartE2EDuration="4.102360724s" podCreationTimestamp="2026-04-17 17:26:20 +0000 UTC" firstStartedPulling="2026-04-17 17:26:21.136056486 +0000 UTC m=+147.107339813" lastFinishedPulling="2026-04-17 17:26:23.096145135 +0000 UTC m=+149.067428457" observedRunningTime="2026-04-17 17:26:24.100930071 +0000 UTC m=+150.072213411" watchObservedRunningTime="2026-04-17 17:26:24.102360724 +0000 UTC m=+150.073644065" Apr 17 17:26:25.543805 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.543772 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c"] Apr 17 17:26:25.546864 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.546843 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:25.549280 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.549260 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 17:26:25.549394 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.549294 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-fk7kv\"" Apr 17 17:26:25.558243 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.558220 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c"] Apr 17 17:26:25.706915 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.706877 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f527a836-fa7a-4cf4-8ad3-586bee36a45b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qst7c\" (UID: \"f527a836-fa7a-4cf4-8ad3-586bee36a45b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:25.807948 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.807877 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f527a836-fa7a-4cf4-8ad3-586bee36a45b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qst7c\" (UID: \"f527a836-fa7a-4cf4-8ad3-586bee36a45b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:25.810189 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.810162 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f527a836-fa7a-4cf4-8ad3-586bee36a45b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qst7c\" (UID: \"f527a836-fa7a-4cf4-8ad3-586bee36a45b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:25.855747 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.855707 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:25.970051 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:25.970017 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c"] Apr 17 17:26:25.973726 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:25.973699 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf527a836_fa7a_4cf4_8ad3_586bee36a45b.slice/crio-0e7fc65e49b6d01468d27771857784fe201880002c06ae763435510571fff709 WatchSource:0}: Error finding container 0e7fc65e49b6d01468d27771857784fe201880002c06ae763435510571fff709: Status 404 returned error can't find the container with id 0e7fc65e49b6d01468d27771857784fe201880002c06ae763435510571fff709 Apr 17 17:26:26.094677 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:26.094640 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" event={"ID":"f527a836-fa7a-4cf4-8ad3-586bee36a45b","Type":"ContainerStarted","Data":"0e7fc65e49b6d01468d27771857784fe201880002c06ae763435510571fff709"} Apr 17 17:26:27.994773 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:27.994741 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85b564b9bd-ljx2k"] Apr 17 17:26:27.999200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:27.999179 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.001797 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.001774 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:26:28.001897 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.001774 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:26:28.001897 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.001779 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:26:28.002833 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.002808 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:26:28.002833 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.002816 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:26:28.002993 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.002835 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:26:28.002993 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.002870 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:26:28.002993 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.002881 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7hcbl\"" Apr 17 17:26:28.006631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.006608 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b564b9bd-ljx2k"] Apr 17 17:26:28.100766 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:26:28.100724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" event={"ID":"f527a836-fa7a-4cf4-8ad3-586bee36a45b","Type":"ContainerStarted","Data":"2957a0a9d3f1cbfb435733390185b2f0fda12308afdb7f815fb251906ed07b86"} Apr 17 17:26:28.100941 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.100923 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:28.106313 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.106287 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" Apr 17 17:26:28.117010 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.116964 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qst7c" podStartSLOduration=2.048329121 podStartE2EDuration="3.116951895s" podCreationTimestamp="2026-04-17 17:26:25 +0000 UTC" firstStartedPulling="2026-04-17 17:26:25.975665146 +0000 UTC m=+151.946948466" lastFinishedPulling="2026-04-17 17:26:27.044287917 +0000 UTC m=+153.015571240" observedRunningTime="2026-04-17 17:26:28.116008365 +0000 UTC m=+154.087291718" watchObservedRunningTime="2026-04-17 17:26:28.116951895 +0000 UTC m=+154.088235233" Apr 17 17:26:28.123968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.123942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6chr\" (UniqueName: \"kubernetes.io/projected/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-kube-api-access-c6chr\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.124094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.123985 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-oauth-config\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.124094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.124001 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-service-ca\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.124094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.124047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-oauth-serving-cert\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.124188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.124111 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-config\") pod \"console-85b564b9bd-ljx2k\" (UID: 
\"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.124188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.124151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-serving-cert\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.224729 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.224688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-config\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.224729 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.224737 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-serving-cert\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.224927 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.224884 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6chr\" (UniqueName: \"kubernetes.io/projected/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-kube-api-access-c6chr\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.224968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.224934 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-oauth-config\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.225079 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.225061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-service-ca\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.225167 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.225100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-oauth-serving-cert\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.225767 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.225372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-config\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.225916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.225817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-oauth-serving-cert\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.226288 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.226267 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-service-ca\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.227117 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.227098 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-oauth-config\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.227284 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.227267 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-serving-cert\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.233666 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.233647 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6chr\" (UniqueName: \"kubernetes.io/projected/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-kube-api-access-c6chr\") pod \"console-85b564b9bd-ljx2k\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") " pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.307835 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.307746 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:28.423726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:28.423683 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b564b9bd-ljx2k"] Apr 17 17:26:28.427495 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:28.427459 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78f7f76_a27f_47b0_8ede_0c1d18d956ef.slice/crio-c0bf91aee6872cd000b43b94eeb6748755e6a797e00ed428e0ce3d042c68e1fa WatchSource:0}: Error finding container c0bf91aee6872cd000b43b94eeb6748755e6a797e00ed428e0ce3d042c68e1fa: Status 404 returned error can't find the container with id c0bf91aee6872cd000b43b94eeb6748755e6a797e00ed428e0ce3d042c68e1fa Apr 17 17:26:29.104678 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:29.104645 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b564b9bd-ljx2k" event={"ID":"d78f7f76-a27f-47b0-8ede-0c1d18d956ef","Type":"ContainerStarted","Data":"c0bf91aee6872cd000b43b94eeb6748755e6a797e00ed428e0ce3d042c68e1fa"} Apr 17 17:26:30.426020 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:30.425977 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-h5vmx" podUID="c063b8d8-8182-438f-a272-69a64fcbb153" Apr 17 17:26:30.430126 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:30.430096 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l6nv9" podUID="1586f132-dd9c-4636-a7c7-87b1b730dc01" Apr 17 17:26:30.660701 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:30.660661 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-knvfd" podUID="1227f475-d747-4720-ad95-d72a46d6d1fb" Apr 17 17:26:31.111927 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:31.111896 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:26:31.112099 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:31.112059 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h5vmx" Apr 17 17:26:33.398914 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.398879 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5865f69957-bxscz"] Apr 17 17:26:33.401588 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.401560 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.410606 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.410570 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:26:33.413286 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.413263 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5865f69957-bxscz"] Apr 17 17:26:33.460886 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.460851 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmzp\" (UniqueName: \"kubernetes.io/projected/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-kube-api-access-swmzp\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.460886 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.460888 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-trusted-ca-bundle\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.461087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.460927 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-oauth-config\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.461087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.460949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-oauth-serving-cert\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.461087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.460975 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-serving-cert\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.461087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.460997 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-config\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.461087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.461074 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-service-ca\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.561889 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:26:33.561858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-oauth-config\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.561983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.561907 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-oauth-serving-cert\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.561983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.561941 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-serving-cert\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.561983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.561960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-config\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.562086 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.561987 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-service-ca\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.562232 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.562207 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swmzp\" (UniqueName: \"kubernetes.io/projected/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-kube-api-access-swmzp\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.562313 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.562250 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-trusted-ca-bundle\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.562843 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.562818 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-oauth-serving-cert\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.562843 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.562835 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-config\") pod \"console-5865f69957-bxscz\" (UID: 
\"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.562979 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.562839 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-service-ca\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.563161 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.563144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-trusted-ca-bundle\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.564340 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.564317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-oauth-config\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.564459 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.564442 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-serving-cert\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.570396 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.570378 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmzp\" (UniqueName: \"kubernetes.io/projected/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-kube-api-access-swmzp\") pod \"console-5865f69957-bxscz\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.710858 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.710740 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:33.826293 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:33.826257 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5865f69957-bxscz"] Apr 17 17:26:33.829257 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:33.829228 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1dfc111_59e4_4a1c_8519_6e9d5f94bf6c.slice/crio-0cd14c475bf68a2c1b519fc5565631b0f41ea42e0785e3fb1750826a4063b22c WatchSource:0}: Error finding container 0cd14c475bf68a2c1b519fc5565631b0f41ea42e0785e3fb1750826a4063b22c: Status 404 returned error can't find the container with id 0cd14c475bf68a2c1b519fc5565631b0f41ea42e0785e3fb1750826a4063b22c Apr 17 17:26:34.119189 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:34.119149 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5865f69957-bxscz" event={"ID":"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c","Type":"ContainerStarted","Data":"0cd14c475bf68a2c1b519fc5565631b0f41ea42e0785e3fb1750826a4063b22c"} Apr 17 17:26:35.376059 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.376009 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:26:35.376059 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.376068 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:26:35.378469 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.378450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1586f132-dd9c-4636-a7c7-87b1b730dc01-cert\") pod \"ingress-canary-l6nv9\" (UID: \"1586f132-dd9c-4636-a7c7-87b1b730dc01\") " pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:26:35.378515 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.378470 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c063b8d8-8182-438f-a272-69a64fcbb153-metrics-tls\") pod \"dns-default-h5vmx\" (UID: \"c063b8d8-8182-438f-a272-69a64fcbb153\") " pod="openshift-dns/dns-default-h5vmx" Apr 17 17:26:35.615392 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.615351 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hd29f\"" Apr 17 17:26:35.618398 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.615693 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qbw75\"" Apr 17 17:26:35.622308 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.622286 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l6nv9" Apr 17 17:26:35.622554 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.622537 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h5vmx" Apr 17 17:26:35.750027 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.749998 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h5vmx"] Apr 17 17:26:35.753445 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:35.753415 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc063b8d8_8182_438f_a272_69a64fcbb153.slice/crio-7f31e6ab74f5d2014c8c33525ab0f908ac8d193e8c163e3ca074f3edba295f03 WatchSource:0}: Error finding container 7f31e6ab74f5d2014c8c33525ab0f908ac8d193e8c163e3ca074f3edba295f03: Status 404 returned error can't find the container with id 7f31e6ab74f5d2014c8c33525ab0f908ac8d193e8c163e3ca074f3edba295f03 Apr 17 17:26:35.762496 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:35.762473 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l6nv9"] Apr 17 17:26:35.767141 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:35.767111 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1586f132_dd9c_4636_a7c7_87b1b730dc01.slice/crio-a42640dc095da49cc5e2816bced8baddca3c37ed00e520cb829832bc523b0d77 WatchSource:0}: Error finding container a42640dc095da49cc5e2816bced8baddca3c37ed00e520cb829832bc523b0d77: Status 404 returned error can't find the container with id a42640dc095da49cc5e2816bced8baddca3c37ed00e520cb829832bc523b0d77 Apr 17 17:26:36.125516 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:36.125478 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h5vmx" event={"ID":"c063b8d8-8182-438f-a272-69a64fcbb153","Type":"ContainerStarted","Data":"7f31e6ab74f5d2014c8c33525ab0f908ac8d193e8c163e3ca074f3edba295f03"} Apr 17 17:26:36.126388 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:36.126367 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l6nv9" event={"ID":"1586f132-dd9c-4636-a7c7-87b1b730dc01","Type":"ContainerStarted","Data":"a42640dc095da49cc5e2816bced8baddca3c37ed00e520cb829832bc523b0d77"} Apr 17 17:26:38.132822 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:38.132789 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l6nv9" event={"ID":"1586f132-dd9c-4636-a7c7-87b1b730dc01","Type":"ContainerStarted","Data":"781758148dbdf95ad6a78785a7d4a9f423f4a37c49cf7304471cba49dddecbf7"} Apr 17 17:26:38.150200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:38.150151 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l6nv9" podStartSLOduration=129.662626051 podStartE2EDuration="2m11.150135682s" podCreationTimestamp="2026-04-17 17:24:27 +0000 UTC" firstStartedPulling="2026-04-17 17:26:35.768967556 +0000 UTC m=+161.740250876" lastFinishedPulling="2026-04-17 17:26:37.256477174 +0000 UTC m=+163.227760507" observedRunningTime="2026-04-17 17:26:38.14848805 +0000 UTC m=+164.119771403" watchObservedRunningTime="2026-04-17 17:26:38.150135682 +0000 UTC m=+164.121419023" Apr 17 17:26:38.792085 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:38.792039 2580 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for 
docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: reading manifest sha256:6d8e3c5fb28d234506e9abc30bc21fc6dcd21d3a080cc16a7fce47b52fc92090 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0" Apr 17 17:26:38.792384 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:38.792320 2580 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:console,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0,Command:[/opt/bridge/bin/bridge --public-dir=/opt/bridge/static --config=/var/console-config/console-config.yaml --service-ca-file=/var/service-ca/service-ca.crt --v=2],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:console-serving-cert,ReadOnly:true,MountPath:/var/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:console-oauth-config,ReadOnly:true,MountPath:/var/oauth-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:console-config,ReadOnly:true,MountPath:/var/console-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca,ReadOnly:true,MountPath:/var/service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:oauth-serving-cert,ReadOnly:true,MountPath:/var/oauth-serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c6chr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:1,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[sleep 
25],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},StopSignal:nil,},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000210000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:30,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod console-85b564b9bd-ljx2k_openshift-console(d78f7f76-a27f-47b0-8ede-0c1d18d956ef): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: reading manifest sha256:6d8e3c5fb28d234506e9abc30bc21fc6dcd21d3a080cc16a7fce47b52fc92090 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:26:38.794205 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:38.794176 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: reading manifest sha256:6d8e3c5fb28d234506e9abc30bc21fc6dcd21d3a080cc16a7fce47b52fc92090 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-console/console-85b564b9bd-ljx2k" podUID="d78f7f76-a27f-47b0-8ede-0c1d18d956ef" Apr 17 17:26:39.135864 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:39.135835 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: reading manifest sha256:6d8e3c5fb28d234506e9abc30bc21fc6dcd21d3a080cc16a7fce47b52fc92090 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-console/console-85b564b9bd-ljx2k" podUID="d78f7f76-a27f-47b0-8ede-0c1d18d956ef" Apr 17 17:26:39.913251 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.913215 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"] Apr 17 17:26:39.916920 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.916900 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:39.920321 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.920299 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 17:26:39.921140 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.920904 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:26:39.921140 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.920945 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:26:39.921140 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.921046 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:26:39.921140 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.921087 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 17:26:39.921899 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.921880 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6l76j\"" Apr 17 17:26:39.932675 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.932652 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"] Apr 17 17:26:39.960067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.960032 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-njmkn"] Apr 17 17:26:39.962605 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.962565 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:39.965740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.965679 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:26:39.965885 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.965851 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:26:39.966082 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.966064 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s79cd\"" Apr 17 17:26:39.967022 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.966978 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:26:39.977502 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.977480 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"] Apr 17 17:26:39.980050 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.980033 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:39.982716 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.982695 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 17:26:39.983422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.983405 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 17:26:39.984034 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.984013 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 17:26:39.984153 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.984104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wmqfv\"" Apr 17 17:26:39.998356 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:39.998332 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"] Apr 17 17:26:40.011100 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011068 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-accelerators-collector-config\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011257 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011114 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-root\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011257 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011166 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa41c7d-0098-4482-ad82-48d0da635522-metrics-client-ca\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011257 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011216 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nzhd\" (UniqueName: \"kubernetes.io/projected/1fa41c7d-0098-4482-ad82-48d0da635522-kube-api-access-7nzhd\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011465 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011306 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2f2w\" (UniqueName: \"kubernetes.io/projected/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-api-access-q2f2w\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.011465 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011361 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.011465 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011410 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4076-5827-49c0-8be0-d5c74c47988c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.011465 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011440 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.011694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-sys\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011557 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-tls\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011608 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-wtmp\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.011838 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011710 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 
17:26:40.011838 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.011838 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vwx\" (UniqueName: \"kubernetes.io/projected/efba4076-5827-49c0-8be0-d5c74c47988c-kube-api-access-64vwx\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.011838 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4cb39368-8e2f-4db5-bae2-1b7b4455394f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.012000 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4cb39368-8e2f-4db5-bae2-1b7b4455394f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.012000 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011861 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-textfile\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.012000 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.011888 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.112480 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4076-5827-49c0-8be0-d5c74c47988c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.112480 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112481 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.112742 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112505 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-sys\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.112742 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112593 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-sys\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.112742 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112702 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-tls\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.112742 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112737 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.112939 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112784 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-wtmp\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.112939 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.112939 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112867 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.112939 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64vwx\" (UniqueName: \"kubernetes.io/projected/efba4076-5827-49c0-8be0-d5c74c47988c-kube-api-access-64vwx\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: 
\"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112945 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4cb39368-8e2f-4db5-bae2-1b7b4455394f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.112981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4cb39368-8e2f-4db5-bae2-1b7b4455394f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-textfile\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113024 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-wtmp\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113053 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-accelerators-collector-config\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113129 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:40.113119 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-root\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:40.113176 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-tls 
podName:1fa41c7d-0098-4482-ad82-48d0da635522 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:40.613156048 +0000 UTC m=+166.584439375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-tls") pod "node-exporter-njmkn" (UID: "1fa41c7d-0098-4482-ad82-48d0da635522") : secret "node-exporter-tls" not found Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1fa41c7d-0098-4482-ad82-48d0da635522-root\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa41c7d-0098-4482-ad82-48d0da635522-metrics-client-ca\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113246 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nzhd\" (UniqueName: \"kubernetes.io/projected/1fa41c7d-0098-4482-ad82-48d0da635522-kube-api-access-7nzhd\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn" Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/efba4076-5827-49c0-8be0-d5c74c47988c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113294 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2f2w\" (UniqueName: \"kubernetes.io/projected/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-api-access-q2f2w\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.113450 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113336 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" Apr 17 17:26:40.113864 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.113615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4cb39368-8e2f-4db5-bae2-1b7b4455394f-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:40.114217 2580 
secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.114251 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"
Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:40.114279 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-tls podName:efba4076-5827-49c0-8be0-d5c74c47988c nodeName:}" failed. No retries permitted until 2026-04-17 17:26:40.614262741 +0000 UTC m=+166.585546068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-mjgmx" (UID: "efba4076-5827-49c0-8be0-d5c74c47988c") : secret "openshift-state-metrics-tls" not found
Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.114215 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa41c7d-0098-4482-ad82-48d0da635522-metrics-client-ca\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.114552 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-textfile\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.114883 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4cb39368-8e2f-4db5-bae2-1b7b4455394f-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"
Apr 17 17:26:40.115012 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.114972 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-accelerators-collector-config\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.117084 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.117043 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"
Apr 17 17:26:40.117349 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.117276 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"
Apr 17 17:26:40.117730 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.117693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.118062 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.118038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"
Apr 17 17:26:40.125348 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.125320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nzhd\" (UniqueName: \"kubernetes.io/projected/1fa41c7d-0098-4482-ad82-48d0da635522-kube-api-access-7nzhd\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.125865 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.125822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2f2w\" (UniqueName: \"kubernetes.io/projected/4cb39368-8e2f-4db5-bae2-1b7b4455394f-kube-api-access-q2f2w\") pod \"kube-state-metrics-69db897b98-5dqjj\" (UID: \"4cb39368-8e2f-4db5-bae2-1b7b4455394f\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"
Apr 17 17:26:40.126741 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.126717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vwx\" (UniqueName: \"kubernetes.io/projected/efba4076-5827-49c0-8be0-d5c74c47988c-kube-api-access-64vwx\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"
Apr 17 17:26:40.290158 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.290064 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"
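The secret.go:189 and nestedpendingoperations.go:348 errors above show the kubelet's per-volume retry policy: a mount that fails because the referenced secret does not yet exist is parked, and no retry is permitted until the printed durationBeforeRetry has elapsed; the delay starts at 500ms and grows exponentially across repeated failures. A minimal sketch of that doubling backoff, using simplified hypothetical types rather than the actual kubelet structures:

package main

import (
	"fmt"
	"time"
)

// expBackoff mimics the retry gating visible in the log: a failed
// operation may not be retried until an exponentially growing delay
// has elapsed. (Sketch only; the real logic lives in the kubelet's
// nestedpendingoperations bookkeeping.)
type expBackoff struct {
	delay time.Duration // current durationBeforeRetry; zero until the first failure
	max   time.Duration // cap on the delay
}

func (b *expBackoff) onFailure(now time.Time) time.Time {
	if b.delay == 0 {
		b.delay = 500 * time.Millisecond // initial delay, as printed in the log
	} else if b.delay *= 2; b.delay > b.max {
		b.delay = b.max
	}
	return now.Add(b.delay) // "No retries permitted until ..."
}

func main() {
	b := expBackoff{max: 2 * time.Minute}
	t := time.Now()
	for i := 1; i <= 4; i++ {
		next := b.onFailure(t)
		fmt.Printf("failure %d: durationBeforeRetry %v, no retries permitted until %s\n", i, b.delay, next.Format("15:04:05.000"))
	}
}

In this instance the secret appears almost immediately: the same volume mounts successfully at 17:26:40.619914 (below), on the first retry after the 500ms window opens.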
Apr 17 17:26:40.417691 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.417660 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5dqjj"]
Apr 17 17:26:40.421161 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:40.421135 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cb39368_8e2f_4db5_bae2_1b7b4455394f.slice/crio-d9b217791a9cbfb065482a58e946a8d4ace37f00cc44af585f607d27d51bd113 WatchSource:0}: Error finding container d9b217791a9cbfb065482a58e946a8d4ace37f00cc44af585f607d27d51bd113: Status 404 returned error can't find the container with id d9b217791a9cbfb065482a58e946a8d4ace37f00cc44af585f607d27d51bd113
Apr 17 17:26:40.617618 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.617568 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-tls\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.617824 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.617754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"
Apr 17 17:26:40.619923 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.619896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1fa41c7d-0098-4482-ad82-48d0da635522-node-exporter-tls\") pod \"node-exporter-njmkn\" (UID: \"1fa41c7d-0098-4482-ad82-48d0da635522\") " pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.619923 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.619914 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/efba4076-5827-49c0-8be0-d5c74c47988c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-mjgmx\" (UID: \"efba4076-5827-49c0-8be0-d5c74c47988c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"
Apr 17 17:26:40.829104 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.829067 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"
Apr 17 17:26:40.872430 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.872349 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-njmkn"
Apr 17 17:26:40.883104 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:40.883050 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa41c7d_0098_4482_ad82_48d0da635522.slice/crio-807c19323c290a0897d9e855bfe8b55256dfc12151491dd4af4cc58a79abb694 WatchSource:0}: Error finding container 807c19323c290a0897d9e855bfe8b55256dfc12151491dd4af4cc58a79abb694: Status 404 returned error can't find the container with id 807c19323c290a0897d9e855bfe8b55256dfc12151491dd4af4cc58a79abb694
Apr 17 17:26:40.972378 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:40.972337 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx"]
Apr 17 17:26:40.976959 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:40.976927 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefba4076_5827_49c0_8be0_d5c74c47988c.slice/crio-c6d6be25e1fc6c16d52c3fdeb97493d645ecc5059576d4180f67a8dcd8c59897 WatchSource:0}: Error finding container c6d6be25e1fc6c16d52c3fdeb97493d645ecc5059576d4180f67a8dcd8c59897: Status 404 returned error can't find the container with id c6d6be25e1fc6c16d52c3fdeb97493d645ecc5059576d4180f67a8dcd8c59897
Apr 17 17:26:41.048192 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.048165 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:26:41.051624 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.051601 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.054846 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.054954 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.055074 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.055147 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.054846 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.055261 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.055414 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.055530 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-z4rfh\""
Apr 17 17:26:41.055945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.055627 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 17:26:41.057124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.056549 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 17:26:41.084175 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.084127 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:26:41.122645 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122559 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-config-volume\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.122645 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122623 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.122827 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.122827 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122687 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.122827 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122741 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-web-config\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.122827 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122800 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122843 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122886 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4g8t\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-kube-api-access-f4g8t\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122912 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122962 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.122989 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-config-out\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123035 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.123012 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.123350 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.123059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.142647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.142612 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" event={"ID":"efba4076-5827-49c0-8be0-d5c74c47988c","Type":"ContainerStarted","Data":"226ce338c0b37104db8c376ed1659aa2a4ffa2b4eef1e452c682dde4868907f1"}
Apr 17 17:26:41.142783 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.142658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" event={"ID":"efba4076-5827-49c0-8be0-d5c74c47988c","Type":"ContainerStarted","Data":"9fa0e2dbc8d824d7c835a7117b6e2bb6ddd1f0f866c405472bb1473f4cb668af"}
Apr 17 17:26:41.142783 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.142674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" event={"ID":"efba4076-5827-49c0-8be0-d5c74c47988c","Type":"ContainerStarted","Data":"c6d6be25e1fc6c16d52c3fdeb97493d645ecc5059576d4180f67a8dcd8c59897"}
Apr 17 17:26:41.143845 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.143818 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" event={"ID":"4cb39368-8e2f-4db5-bae2-1b7b4455394f","Type":"ContainerStarted","Data":"d9b217791a9cbfb065482a58e946a8d4ace37f00cc44af585f607d27d51bd113"}
Apr 17 17:26:41.145387 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.145358 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-njmkn" event={"ID":"1fa41c7d-0098-4482-ad82-48d0da635522","Type":"ContainerStarted","Data":"807c19323c290a0897d9e855bfe8b55256dfc12151491dd4af4cc58a79abb694"}
Apr 17 17:26:41.223598 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223548 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-config-volume\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.223774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.223774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.223774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.223774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223682 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-web-config\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.223774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.223774 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.224066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223794 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4g8t\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-kube-api-access-f4g8t\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.224066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223820 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.224066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223867 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.224066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223892 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-config-out\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.224066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223914 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.224066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.223959 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.225553 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.225193 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:26:41.226756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.226706 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0"
\"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.226756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.226734 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.226926 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.226818 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-config-volume\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.226989 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.226976 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.227497 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.227472 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.227497 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.227480 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.227633 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.227544 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.227721 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.227699 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-config-out\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.227817 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.227802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.228291 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.228276 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.228887 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.228866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-web-config\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.232043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.232021 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4g8t\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-kube-api-access-f4g8t\") pod \"alertmanager-main-0\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.365116 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.365076 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:26:41.855572 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:41.855217 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:26:41.860511 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:41.860461 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1da773b6_e4ed_4f95_b2b0_665baf696140.slice/crio-b2eac114ae8ca229da39e9deaa3098ef77a26542f792a7854af6c3e4efea994f WatchSource:0}: Error finding container b2eac114ae8ca229da39e9deaa3098ef77a26542f792a7854af6c3e4efea994f: Status 404 returned error can't find the container with id b2eac114ae8ca229da39e9deaa3098ef77a26542f792a7854af6c3e4efea994f Apr 17 17:26:42.150265 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.150225 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" event={"ID":"4cb39368-8e2f-4db5-bae2-1b7b4455394f","Type":"ContainerStarted","Data":"33f876df7e26b103b07aeead900dee9f151fe8d87c70174ec3b020b6f28c11eb"} Apr 17 17:26:42.150265 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.150270 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" event={"ID":"4cb39368-8e2f-4db5-bae2-1b7b4455394f","Type":"ContainerStarted","Data":"200f6ba31bc3190fe2d311d97ce91cd90e6ebb3c6c1cb2e4f5657a69e9cfcc56"} Apr 17 17:26:42.150500 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.150285 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" event={"ID":"4cb39368-8e2f-4db5-bae2-1b7b4455394f","Type":"ContainerStarted","Data":"483fe6b17b2cfaa397735cc193e4a70cec536ec73065c897ab5797b1a3f2767d"} Apr 17 17:26:42.151459 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.151434 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"b2eac114ae8ca229da39e9deaa3098ef77a26542f792a7854af6c3e4efea994f"} Apr 17 17:26:42.153013 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.152981 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="1fa41c7d-0098-4482-ad82-48d0da635522" containerID="4f833be905279579a6e6804f79a45ab8cd12324be693f8ef2e2ce3084f98ca4a" exitCode=0 Apr 17 17:26:42.153134 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.153057 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-njmkn" event={"ID":"1fa41c7d-0098-4482-ad82-48d0da635522","Type":"ContainerDied","Data":"4f833be905279579a6e6804f79a45ab8cd12324be693f8ef2e2ce3084f98ca4a"} Apr 17 17:26:42.175088 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:42.174361 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dqjj" podStartSLOduration=1.880830526 podStartE2EDuration="3.174342186s" podCreationTimestamp="2026-04-17 17:26:39 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.423397796 +0000 UTC m=+166.394681115" lastFinishedPulling="2026-04-17 17:26:41.71690944 +0000 UTC m=+167.688192775" observedRunningTime="2026-04-17 17:26:42.173239187 +0000 UTC m=+168.144522529" watchObservedRunningTime="2026-04-17 17:26:42.174342186 +0000 UTC m=+168.145625551" Apr 17 17:26:43.157317 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.157281 2580 generic.go:358] "Generic (PLEG): container finished" podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40" exitCode=0 Apr 17 17:26:43.157803 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.157363 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"} Apr 17 17:26:43.159471 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.159446 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-njmkn" event={"ID":"1fa41c7d-0098-4482-ad82-48d0da635522","Type":"ContainerStarted","Data":"841d96062d53f2fd3f174316b27fe38e8f6ad2116ad1c21f572289cd7bbcee33"} Apr 17 17:26:43.159471 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.159479 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-njmkn" event={"ID":"1fa41c7d-0098-4482-ad82-48d0da635522","Type":"ContainerStarted","Data":"246a60fd62f1d1366b390b689172de7e781fd9a841ebd4ec8c2234572e7221a9"} Apr 17 17:26:43.161478 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.161456 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" event={"ID":"efba4076-5827-49c0-8be0-d5c74c47988c","Type":"ContainerStarted","Data":"aeca82c2f921b0d0f8eddf693ebe0005528d98f2d1a61cc8dd4955331668103c"} Apr 17 17:26:43.227939 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.227833 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-mjgmx" podStartSLOduration=2.995601062 podStartE2EDuration="4.227812398s" podCreationTimestamp="2026-04-17 17:26:39 +0000 UTC" firstStartedPulling="2026-04-17 17:26:41.124959282 +0000 UTC m=+167.096242624" lastFinishedPulling="2026-04-17 17:26:42.357170639 +0000 UTC m=+168.328453960" observedRunningTime="2026-04-17 17:26:43.226687292 +0000 UTC m=+169.197970634" watchObservedRunningTime="2026-04-17 17:26:43.227812398 +0000 UTC m=+169.199095744" Apr 17 17:26:43.245763 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:43.245697 2580 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/node-exporter-njmkn" podStartSLOduration=3.415906426 podStartE2EDuration="4.245678791s" podCreationTimestamp="2026-04-17 17:26:39 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.887327887 +0000 UTC m=+166.858611222" lastFinishedPulling="2026-04-17 17:26:41.717100263 +0000 UTC m=+167.688383587" observedRunningTime="2026-04-17 17:26:43.24520603 +0000 UTC m=+169.216489439" watchObservedRunningTime="2026-04-17 17:26:43.245678791 +0000 UTC m=+169.216962132" Apr 17 17:26:44.028271 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.028221 2580 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0" Apr 17 17:26:44.028550 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.028481 2580 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:console,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0,Command:[/opt/bridge/bin/bridge --public-dir=/opt/bridge/static --config=/var/console-config/console-config.yaml --service-ca-file=/var/service-ca/service-ca.crt --v=2],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:console-serving-cert,ReadOnly:true,MountPath:/var/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:console-oauth-config,ReadOnly:true,MountPath:/var/oauth-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:console-config,ReadOnly:true,MountPath:/var/console-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca,ReadOnly:true,MountPath:/var/service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:oauth-serving-cert,ReadOnly:true,MountPath:/var/oauth-serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swmzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 
Apr 17 17:26:44.028271 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.028221 2580 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0"
Apr 17 17:26:44.028550 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.028481 2580 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:console,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0,Command:[/opt/bridge/bin/bridge --public-dir=/opt/bridge/static --config=/var/console-config/console-config.yaml --service-ca-file=/var/service-ca/service-ca.crt --v=2],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{104857600 0} {} 100Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:console-serving-cert,ReadOnly:true,MountPath:/var/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:console-oauth-config,ReadOnly:true,MountPath:/var/oauth-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:console-config,ReadOnly:true,MountPath:/var/console-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca,ReadOnly:true,MountPath:/var/service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:oauth-serving-cert,ReadOnly:true,MountPath:/var/oauth-serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swmzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:1,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[sleep 25],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},StopSignal:nil,},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000210000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:30,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod console-5865f69957-bxscz_openshift-console(e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 17 17:26:44.029682 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.029650 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-console/console-5865f69957-bxscz" podUID="e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"
Apr 17 17:26:44.165944 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.165898 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0cc2dc261f075be17ea31eb148cce7fc0b11a4dc06add53d19e4f39df155ba0: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-console/console-5865f69957-bxscz" podUID="e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"
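The 504 from quay.io above is a transient registry-side failure: the kubelet surfaces it first as ErrImagePull and then throttles further attempts as ImagePullBackOff (visible at 17:26:44.165898). When scanning a long journal like this one for affected images, a small filter helps; the following is a hypothetical helper, not part of kubelet or CRI-O, assuming the klog format shown above:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

func main() {
	// Matches the image="..." field kubelet appends to
	// "PullImage from image service failed" entries.
	re := regexp.MustCompile(`image="([^"]+)"`)
	seen := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, "PullImage from image service failed") {
			continue
		}
		if m := re.FindStringSubmatch(line); m != nil && !seen[m[1]] {
			seen[m[1]] = true
			fmt.Println(m[1]) // print each failing image reference once
		}
	}
}

Piped from journalctl -u kubelet, this would print only the console image digest for the section above, confirming a single image is affected rather than a registry-wide outage.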
Apr 17 17:26:44.756420 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.756338 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-s954w"]
Apr 17 17:26:44.758726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.758703 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w"
Apr 17 17:26:44.761183 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.761158 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 17:26:44.761262 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.761170 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-j26hq\""
Apr 17 17:26:44.768070 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.768048 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-s954w"]
Apr 17 17:26:44.818143 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.818111 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85b564b9bd-ljx2k"]
Apr 17 17:26:44.857888 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.857848 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a67cfe6e-d4aa-4c24-9313-a4be369b3f41-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s954w\" (UID: \"a67cfe6e-d4aa-4c24-9313-a4be369b3f41\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w"
Apr 17 17:26:44.952834 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.952809 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b564b9bd-ljx2k"
Apr 17 17:26:44.959736 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:44.959707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a67cfe6e-d4aa-4c24-9313-a4be369b3f41-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s954w\" (UID: \"a67cfe6e-d4aa-4c24-9313-a4be369b3f41\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w"
Apr 17 17:26:44.959869 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.959851 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 17 17:26:44.959936 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:44.959922 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a67cfe6e-d4aa-4c24-9313-a4be369b3f41-monitoring-plugin-cert podName:a67cfe6e-d4aa-4c24-9313-a4be369b3f41 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:45.459900999 +0000 UTC m=+171.431184332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a67cfe6e-d4aa-4c24-9313-a4be369b3f41-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-s954w" (UID: "a67cfe6e-d4aa-4c24-9313-a4be369b3f41") : secret "monitoring-plugin-cert" not found
Apr 17 17:26:45.043608 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.043504 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-c5699c7c9-9txtc"]
Apr 17 17:26:45.046238 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.046216 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.048843 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.048790 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 17:26:45.048964 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.048898 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 17:26:45.049055 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.049033 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 17:26:45.049164 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.049149 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 17:26:45.049252 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.049236 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tkgrm\""
Apr 17 17:26:45.049309 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.049264 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 17:26:45.061847 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.061819 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-c5699c7c9-9txtc"]
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.062032 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-oauth-serving-cert\") pod \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") "
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.063014 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-config\") pod \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") "
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.063061 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-oauth-config\") pod \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") "
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.063099 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-service-ca\") pod \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") "
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.063166 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6chr\" (UniqueName: \"kubernetes.io/projected/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-kube-api-access-c6chr\") pod \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") "
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.063228 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-serving-cert\") pod \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\" (UID: \"d78f7f76-a27f-47b0-8ede-0c1d18d956ef\") "
Apr 17 17:26:45.063647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.062379 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d78f7f76-a27f-47b0-8ede-0c1d18d956ef" (UID: "d78f7f76-a27f-47b0-8ede-0c1d18d956ef"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:45.065840 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.064670 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 17:26:45.065840 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.064806 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-config" (OuterVolumeSpecName: "console-config") pod "d78f7f76-a27f-47b0-8ede-0c1d18d956ef" (UID: "d78f7f76-a27f-47b0-8ede-0c1d18d956ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:45.065840 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.064916 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "d78f7f76-a27f-47b0-8ede-0c1d18d956ef" (UID: "d78f7f76-a27f-47b0-8ede-0c1d18d956ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:45.066865 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.066837 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d78f7f76-a27f-47b0-8ede-0c1d18d956ef" (UID: "d78f7f76-a27f-47b0-8ede-0c1d18d956ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:45.067044 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.067016 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-kube-api-access-c6chr" (OuterVolumeSpecName: "kube-api-access-c6chr") pod "d78f7f76-a27f-47b0-8ede-0c1d18d956ef" (UID: "d78f7f76-a27f-47b0-8ede-0c1d18d956ef"). InnerVolumeSpecName "kube-api-access-c6chr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:45.068618 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.068564 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d78f7f76-a27f-47b0-8ede-0c1d18d956ef" (UID: "d78f7f76-a27f-47b0-8ede-0c1d18d956ef"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:45.164525 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164486 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.164740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164560 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-serving-certs-ca-bundle\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.164740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vr4k\" (UniqueName: \"kubernetes.io/projected/88afd659-9be2-49eb-b958-426fa64e4320-kube-api-access-6vr4k\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.164740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-metrics-client-ca\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.164740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164679 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-secret-telemeter-client\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.164740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-trusted-ca-bundle\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164838 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-federate-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc"
\"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164936 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-service-ca\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164952 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6chr\" (UniqueName: \"kubernetes.io/projected/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-kube-api-access-c6chr\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164966 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-serving-cert\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164975 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-oauth-serving-cert\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164984 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:45.165011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.164993 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d78f7f76-a27f-47b0-8ede-0c1d18d956ef-console-oauth-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:45.167680 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.167637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b564b9bd-ljx2k" event={"ID":"d78f7f76-a27f-47b0-8ede-0c1d18d956ef","Type":"ContainerDied","Data":"c0bf91aee6872cd000b43b94eeb6748755e6a797e00ed428e0ce3d042c68e1fa"} Apr 17 17:26:45.167680 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.167677 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b564b9bd-ljx2k" Apr 17 17:26:45.171033 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.171005 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"} Apr 17 17:26:45.171176 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.171044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"} Apr 17 17:26:45.171176 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.171057 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"} Apr 17 17:26:45.171176 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.171070 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"} Apr 17 17:26:45.171176 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.171081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"} Apr 17 17:26:45.207773 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.207744 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85b564b9bd-ljx2k"] Apr 17 17:26:45.211200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.211169 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85b564b9bd-ljx2k"] Apr 17 17:26:45.266123 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-federate-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266273 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266181 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266273 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266218 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266355 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:26:45.266275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-serving-certs-ca-bundle\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266355 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266314 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vr4k\" (UniqueName: \"kubernetes.io/projected/88afd659-9be2-49eb-b958-426fa64e4320-kube-api-access-6vr4k\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266355 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266343 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-metrics-client-ca\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266446 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-secret-telemeter-client\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.266446 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.266401 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-trusted-ca-bundle\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.267071 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.267040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-serving-certs-ca-bundle\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.267242 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.267198 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-metrics-client-ca\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.267441 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.267417 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-trusted-ca-bundle\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.268736 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.268710 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-telemeter-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.268856 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.268779 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.268919 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.268890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-federate-client-tls\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.269011 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.268991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/88afd659-9be2-49eb-b958-426fa64e4320-secret-telemeter-client\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.274405 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.274385 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vr4k\" (UniqueName: \"kubernetes.io/projected/88afd659-9be2-49eb-b958-426fa64e4320-kube-api-access-6vr4k\") pod \"telemeter-client-c5699c7c9-9txtc\" (UID: \"88afd659-9be2-49eb-b958-426fa64e4320\") " pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.361895 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.361863 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" Apr 17 17:26:45.469134 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.469104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a67cfe6e-d4aa-4c24-9313-a4be369b3f41-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s954w\" (UID: \"a67cfe6e-d4aa-4c24-9313-a4be369b3f41\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" Apr 17 17:26:45.471768 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.471737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a67cfe6e-d4aa-4c24-9313-a4be369b3f41-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-s954w\" (UID: \"a67cfe6e-d4aa-4c24-9313-a4be369b3f41\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" Apr 17 17:26:45.507916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.507886 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-c5699c7c9-9txtc"] Apr 17 17:26:45.511404 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:45.511375 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88afd659_9be2_49eb_b958_426fa64e4320.slice/crio-7ab3614f8c87cea2d6c54adaeb9f4a6c638483a3815fb1ac2f0b4dbabf560822 WatchSource:0}: Error finding container 7ab3614f8c87cea2d6c54adaeb9f4a6c638483a3815fb1ac2f0b4dbabf560822: Status 404 returned error can't find the container with id 7ab3614f8c87cea2d6c54adaeb9f4a6c638483a3815fb1ac2f0b4dbabf560822 Apr 17 17:26:45.636132 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.636096 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd" Apr 17 17:26:45.669295 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.669260 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" Apr 17 17:26:45.784965 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:45.784935 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-s954w"] Apr 17 17:26:45.787771 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:45.787740 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67cfe6e_d4aa_4c24_9313_a4be369b3f41.slice/crio-87b70a199049d37802c76669180063988a5eb343d911512c21b591d8df99e7c7 WatchSource:0}: Error finding container 87b70a199049d37802c76669180063988a5eb343d911512c21b591d8df99e7c7: Status 404 returned error can't find the container with id 87b70a199049d37802c76669180063988a5eb343d911512c21b591d8df99e7c7 Apr 17 17:26:46.175943 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:46.175907 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" event={"ID":"88afd659-9be2-49eb-b958-426fa64e4320","Type":"ContainerStarted","Data":"7ab3614f8c87cea2d6c54adaeb9f4a6c638483a3815fb1ac2f0b4dbabf560822"} Apr 17 17:26:46.179831 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:46.179797 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerStarted","Data":"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"} Apr 17 17:26:46.183595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:46.183554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" event={"ID":"a67cfe6e-d4aa-4c24-9313-a4be369b3f41","Type":"ContainerStarted","Data":"87b70a199049d37802c76669180063988a5eb343d911512c21b591d8df99e7c7"} Apr 17 17:26:46.214535 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:46.214475 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.790262153 podStartE2EDuration="5.214455661s" podCreationTimestamp="2026-04-17 17:26:41 +0000 UTC" firstStartedPulling="2026-04-17 17:26:41.8629091 +0000 UTC m=+167.834192424" lastFinishedPulling="2026-04-17 17:26:45.287102612 +0000 UTC m=+171.258385932" observedRunningTime="2026-04-17 17:26:46.212964429 +0000 UTC m=+172.184247807" watchObservedRunningTime="2026-04-17 17:26:46.214455661 +0000 UTC m=+172.185739004" Apr 17 17:26:46.642563 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:46.642526 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d78f7f76-a27f-47b0-8ede-0c1d18d956ef" path="/var/lib/kubelet/pods/d78f7f76-a27f-47b0-8ede-0c1d18d956ef/volumes" Apr 17 17:26:48.191549 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.191509 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" event={"ID":"a67cfe6e-d4aa-4c24-9313-a4be369b3f41","Type":"ContainerStarted","Data":"e6684c6191046491d809df4846711607adf540cb048b9972120625914ac6c9d4"} Apr 17 17:26:48.192023 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.191724 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" Apr 17 17:26:48.193433 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.193399 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" 
event={"ID":"88afd659-9be2-49eb-b958-426fa64e4320","Type":"ContainerStarted","Data":"ee6acb4722f8260159fae5d81d0a238c812da8e2f82ed7ae5d09dd3303e7cfb7"} Apr 17 17:26:48.193556 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.193436 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" event={"ID":"88afd659-9be2-49eb-b958-426fa64e4320","Type":"ContainerStarted","Data":"2459b812fa22e04fc2c109b7013baa4cd44429303c1d59ef6a66cc4e9aa69522"} Apr 17 17:26:48.193556 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.193449 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" event={"ID":"88afd659-9be2-49eb-b958-426fa64e4320","Type":"ContainerStarted","Data":"46d41cc00a1f6999c347fbbc7698dc61cbd2b7e6de5d0f704e0bb28b92ad791e"} Apr 17 17:26:48.196982 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.196961 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" Apr 17 17:26:48.208624 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.208558 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-s954w" podStartSLOduration=2.648315083 podStartE2EDuration="4.2085462s" podCreationTimestamp="2026-04-17 17:26:44 +0000 UTC" firstStartedPulling="2026-04-17 17:26:45.789444794 +0000 UTC m=+171.760728116" lastFinishedPulling="2026-04-17 17:26:47.349675901 +0000 UTC m=+173.320959233" observedRunningTime="2026-04-17 17:26:48.20707433 +0000 UTC m=+174.178357671" watchObservedRunningTime="2026-04-17 17:26:48.2085462 +0000 UTC m=+174.179829586" Apr 17 17:26:48.229103 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:48.229049 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-c5699c7c9-9txtc" podStartSLOduration=1.392769075 podStartE2EDuration="3.229032947s" podCreationTimestamp="2026-04-17 17:26:45 +0000 UTC" firstStartedPulling="2026-04-17 17:26:45.513357321 +0000 UTC m=+171.484640640" lastFinishedPulling="2026-04-17 17:26:47.34962119 +0000 UTC m=+173.320904512" observedRunningTime="2026-04-17 17:26:48.226638231 +0000 UTC m=+174.197921573" watchObservedRunningTime="2026-04-17 17:26:48.229032947 +0000 UTC m=+174.200316288" Apr 17 17:26:49.611189 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.611154 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5865f69957-bxscz"] Apr 17 17:26:49.641425 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.641392 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-576f4646d-j9plt"] Apr 17 17:26:49.643727 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.643704 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.655862 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.655839 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576f4646d-j9plt"] Apr 17 17:26:49.707073 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707037 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-service-ca\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.707225 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707133 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-trusted-ca-bundle\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.707225 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707187 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-serving-cert\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.707294 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707273 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-oauth-config\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.707327 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-oauth-serving-cert\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.707393 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-config\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.707431 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.707418 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thptk\" (UniqueName: \"kubernetes.io/projected/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-kube-api-access-thptk\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.729128 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.729107 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:49.808407 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808377 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swmzp\" (UniqueName: \"kubernetes.io/projected/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-kube-api-access-swmzp\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808594 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808417 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-service-ca\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808594 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808460 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-oauth-config\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808594 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808494 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-trusted-ca-bundle\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808594 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808528 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-config\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808797 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808613 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-serving-cert\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808797 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808651 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-oauth-serving-cert\") pod \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\" (UID: \"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c\") " Apr 17 17:26:49.808797 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-config\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.808941 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thptk\" (UniqueName: \"kubernetes.io/projected/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-kube-api-access-thptk\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 
17:26:49.808941 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808871 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-service-ca\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.808941 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808904 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-trusted-ca-bundle\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.808941 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808912 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:49.809122 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.808949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-serving-cert\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.809122 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-oauth-config\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.809122 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-oauth-serving-cert\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.809122 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809115 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-service-ca\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.809295 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809171 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-config" (OuterVolumeSpecName: "console-config") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:49.809295 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809259 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:49.810962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-config\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.810962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.809817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-oauth-serving-cert\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.810962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.810066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-trusted-ca-bundle\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.810962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.810523 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:49.810962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.810559 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-service-ca\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.812151 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.812035 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-kube-api-access-swmzp" (OuterVolumeSpecName: "kube-api-access-swmzp") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "kube-api-access-swmzp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:49.812537 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.812504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-oauth-config\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.812966 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.812942 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:49.813070 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.813003 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" (UID: "e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:49.813409 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.813390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-serving-cert\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.819286 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.819263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thptk\" (UniqueName: \"kubernetes.io/projected/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-kube-api-access-thptk\") pod \"console-576f4646d-j9plt\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") " pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:49.909794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.909715 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-serving-cert\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.909794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.909743 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-oauth-serving-cert\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.909794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.909752 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-swmzp\" (UniqueName: \"kubernetes.io/projected/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-kube-api-access-swmzp\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.909794 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.909770 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-oauth-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.909794 ip-10-0-130-19 
kubenswrapper[2580]: I0417 17:26:49.909780 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-trusted-ca-bundle\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.910041 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.909810 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c-console-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:26:49.952879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:49.952837 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:50.067229 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.067198 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576f4646d-j9plt"] Apr 17 17:26:50.070221 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:26:50.070191 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53aa45fa_1f1d_4677_a7e1_d6d47ec78efa.slice/crio-2a4b8f8d2b9febb422b0215da4d9074ced3a4f97a46c7ace64931f38629c8b74 WatchSource:0}: Error finding container 2a4b8f8d2b9febb422b0215da4d9074ced3a4f97a46c7ace64931f38629c8b74: Status 404 returned error can't find the container with id 2a4b8f8d2b9febb422b0215da4d9074ced3a4f97a46c7ace64931f38629c8b74 Apr 17 17:26:50.205589 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.205472 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576f4646d-j9plt" event={"ID":"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa","Type":"ContainerStarted","Data":"2a4b8f8d2b9febb422b0215da4d9074ced3a4f97a46c7ace64931f38629c8b74"} Apr 17 17:26:50.206278 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.206251 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5865f69957-bxscz" event={"ID":"e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c","Type":"ContainerDied","Data":"0cd14c475bf68a2c1b519fc5565631b0f41ea42e0785e3fb1750826a4063b22c"} Apr 17 17:26:50.206396 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.206257 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5865f69957-bxscz" Apr 17 17:26:50.247807 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.247777 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5865f69957-bxscz"] Apr 17 17:26:50.254670 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.254645 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5865f69957-bxscz"] Apr 17 17:26:50.641263 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:50.641230 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c" path="/var/lib/kubelet/pods/e1dfc111-59e4-4a1c-8519-6e9d5f94bf6c/volumes" Apr 17 17:26:53.217349 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:53.217312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576f4646d-j9plt" event={"ID":"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa","Type":"ContainerStarted","Data":"c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2"} Apr 17 17:26:53.235647 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:53.235602 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-576f4646d-j9plt" podStartSLOduration=1.757017463 podStartE2EDuration="4.235568695s" podCreationTimestamp="2026-04-17 17:26:49 +0000 UTC" firstStartedPulling="2026-04-17 17:26:50.072080416 +0000 UTC m=+176.043363734" lastFinishedPulling="2026-04-17 17:26:52.550631646 +0000 UTC m=+178.521914966" observedRunningTime="2026-04-17 17:26:53.234422912 +0000 UTC m=+179.205706254" watchObservedRunningTime="2026-04-17 17:26:53.235568695 +0000 UTC m=+179.206852036" Apr 17 17:26:55.770246 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:55.770197 2580 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: get manifest: build image source: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac" Apr 17 17:26:55.770701 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:55.770420 2580 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:dns,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac,Command:[coredns],Args:[-conf /etc/coredns/Corefile],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:dns,HostPort:0,ContainerPort:5353,Protocol:UDP,HostIP:,},ContainerPort{Name:dns-tcp,HostPort:0,ContainerPort:5353,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{73400320 0} {} 70Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-volume,ReadOnly:true,MountPath:/etc/coredns,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z45pm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:60,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:5,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8181 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:3,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dns-default-h5vmx_openshift-dns(c063b8d8-8182-438f-a272-69a64fcbb153): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: get manifest: build image source: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out" logger="UnhandledError" Apr 17 17:26:55.889393 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:55.889358 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: get manifest: build image source: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openshift-dns/dns-default-h5vmx" podUID="c063b8d8-8182-438f-a272-69a64fcbb153" Apr 17 17:26:56.227233 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:56.227197 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h5vmx" event={"ID":"c063b8d8-8182-438f-a272-69a64fcbb153","Type":"ContainerStarted","Data":"ad7d2d8ee986313033585c4c6ae27d5bf22506a0820da55e091528d4ada9273c"} Apr 17 17:26:56.228354 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:56.228318 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: get manifest: build image source: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openshift-dns/dns-default-h5vmx" podUID="c063b8d8-8182-438f-a272-69a64fcbb153" Apr 17 17:26:57.231445 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:26:57.231416 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c862e4b31529b530f930a8d0e2a75b53d2092f392a29d037d0312169e1d4a1ac: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out; artifact err: get manifest: build image source: pinging container registry quay.io: received unexpected HTTP status: 504 Gateway Time-out\"" pod="openshift-dns/dns-default-h5vmx" podUID="c063b8d8-8182-438f-a272-69a64fcbb153" Apr 17 17:26:59.953256 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:59.953212 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:59.953256 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:59.953259 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:26:59.957832 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:26:59.957810 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:27:00.241989 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:00.241897 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:27:01.829910 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:01.829877 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h5vmx_c063b8d8-8182-438f-a272-69a64fcbb153/kube-rbac-proxy/0.log" Apr 17 17:27:03.029733 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:03.029701 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pch4m_23dab589-f077-4e94-93bc-392122228de4/dns-node-resolver/0.log" Apr 17 17:27:03.830095 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:03.830061 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l6nv9_1586f132-dd9c-4636-a7c7-87b1b730dc01/serve-healthcheck-canary/0.log" Apr 17 17:27:12.273111 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:12.273077 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h5vmx" event={"ID":"c063b8d8-8182-438f-a272-69a64fcbb153","Type":"ContainerStarted","Data":"fff9ebd7bd117a22468eb59582468fc64ae74f1615d2b168cb86171a208f4f4f"} Apr 17 17:27:12.273515 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:12.273294 2580 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h5vmx" Apr 17 17:27:12.293124 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:12.293073 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h5vmx" podStartSLOduration=129.22497468 podStartE2EDuration="2m45.293057381s" podCreationTimestamp="2026-04-17 17:24:27 +0000 UTC" firstStartedPulling="2026-04-17 17:26:35.755802133 +0000 UTC m=+161.727085457" lastFinishedPulling="2026-04-17 17:27:11.823884819 +0000 UTC m=+197.795168158" observedRunningTime="2026-04-17 17:27:12.292651781 +0000 UTC m=+198.263935123" watchObservedRunningTime="2026-04-17 17:27:12.293057381 +0000 UTC m=+198.264340722" Apr 17 17:27:22.279002 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:27:22.278970 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h5vmx" Apr 17 17:28:00.304142 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304107 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:28:00.304634 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304523 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="alertmanager" containerID="cri-o://1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53" gracePeriod=120 Apr 17 17:28:00.304726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304621 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-metric" containerID="cri-o://0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355" gracePeriod=120 Apr 17 17:28:00.304726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304620 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-web" containerID="cri-o://8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5" gracePeriod=120 Apr 17 17:28:00.304800 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304630 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="prom-label-proxy" containerID="cri-o://942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7" gracePeriod=120 Apr 17 17:28:00.308027 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304987 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy" containerID="cri-o://3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb" gracePeriod=120 Apr 17 17:28:00.308027 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:00.304661 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="config-reloader" containerID="cri-o://409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d" gracePeriod=120 Apr 17 17:28:01.413398 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413369 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7" exitCode=0 Apr 17 17:28:01.413398 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413393 2580 generic.go:358] "Generic (PLEG): container finished" podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb" exitCode=0 Apr 17 17:28:01.413398 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413400 2580 generic.go:358] "Generic (PLEG): container finished" podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d" exitCode=0 Apr 17 17:28:01.413798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413409 2580 generic.go:358] "Generic (PLEG): container finished" podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53" exitCode=0 Apr 17 17:28:01.413798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413437 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"} Apr 17 17:28:01.413798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413469 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"} Apr 17 17:28:01.413798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413480 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"} Apr 17 17:28:01.413798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.413489 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"} Apr 17 17:28:01.545258 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.545233 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:01.632454 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632367 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-main-db\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632454 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632414 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-tls-assets\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632454 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632440 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-config-out\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632759 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632470 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-metrics-client-ca\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632759 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632511 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632759 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632547 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-main-tls\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632759 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632604 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.632939 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632825 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:28:01.632995 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632942 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:01.632995 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.632973 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-web-config\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.633093 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633019 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-web\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.633093 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633070 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-cluster-tls-config\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.633192 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633120 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-config-volume\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.633192 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633148 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-trusted-ca-bundle\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.633280 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633192 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4g8t\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-kube-api-access-f4g8t\") pod \"1da773b6-e4ed-4f95-b2b0-665baf696140\" (UID: \"1da773b6-e4ed-4f95-b2b0-665baf696140\") "
Apr 17 17:28:01.633787 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633463 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-main-db\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.633787 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.633489 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-metrics-client-ca\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.635222 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.635193 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:01.635313 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.635259 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.635313 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.635298 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-config-out" (OuterVolumeSpecName: "config-out") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:28:01.635702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.635633 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.635702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.635675 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:01.635923 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.635899 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.636962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.636930 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-kube-api-access-f4g8t" (OuterVolumeSpecName: "kube-api-access-f4g8t") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "kube-api-access-f4g8t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:01.637067 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.637027 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.637702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.637675 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-config-volume" (OuterVolumeSpecName: "config-volume") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.639905 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.639793 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.646318 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.646292 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-web-config" (OuterVolumeSpecName: "web-config") pod "1da773b6-e4ed-4f95-b2b0-665baf696140" (UID: "1da773b6-e4ed-4f95-b2b0-665baf696140"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:01.734235 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734199 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-cluster-tls-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734304 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-config-volume\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734328 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da773b6-e4ed-4f95-b2b0-665baf696140-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734345 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4g8t\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-kube-api-access-f4g8t\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734362 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1da773b6-e4ed-4f95-b2b0-665baf696140-tls-assets\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734376 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1da773b6-e4ed-4f95-b2b0-665baf696140-config-out\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734393 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734414 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-main-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734432 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734447 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-web-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:01.734656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:01.734462 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1da773b6-e4ed-4f95-b2b0-665baf696140-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:28:02.418620 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418560 2580 generic.go:358] "Generic (PLEG): container finished" podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355" exitCode=0
Apr 17 17:28:02.418620 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418608 2580 generic.go:358] "Generic (PLEG): container finished" podID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerID="8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5" exitCode=0
Apr 17 17:28:02.418620 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418607 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"}
Apr 17 17:28:02.419223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418643 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"}
Apr 17 17:28:02.419223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418653 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1da773b6-e4ed-4f95-b2b0-665baf696140","Type":"ContainerDied","Data":"b2eac114ae8ca229da39e9deaa3098ef77a26542f792a7854af6c3e4efea994f"}
Apr 17 17:28:02.419223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418669 2580 scope.go:117] "RemoveContainer" containerID="942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"
Apr 17 17:28:02.419223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.418675 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.427753 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.427689 2580 scope.go:117] "RemoveContainer" containerID="0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"
Apr 17 17:28:02.439626 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.439593 2580 scope.go:117] "RemoveContainer" containerID="3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"
Apr 17 17:28:02.444350 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.444326 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:28:02.448076 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.448054 2580 scope.go:117] "RemoveContainer" containerID="8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"
Apr 17 17:28:02.450968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.450948 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:28:02.456229 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.456207 2580 scope.go:117] "RemoveContainer" containerID="409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"
Apr 17 17:28:02.464331 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.464311 2580 scope.go:117] "RemoveContainer" containerID="1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"
Apr 17 17:28:02.471504 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.471406 2580 scope.go:117] "RemoveContainer" containerID="27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"
Apr 17 17:28:02.478401 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.478231 2580 scope.go:117] "RemoveContainer" containerID="942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"
Apr 17 17:28:02.478611 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.478560 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7\": container with ID starting with 942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7 not found: ID does not exist" containerID="942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"
Apr 17 17:28:02.478702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.478628 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"} err="failed to get container status \"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7\": rpc error: code = NotFound desc = could not find container \"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7\": container with ID starting with 942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7 not found: ID does not exist"
Apr 17 17:28:02.478702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.478676 2580 scope.go:117] "RemoveContainer" containerID="0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"
Apr 17 17:28:02.479117 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.479089 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355\": container with ID starting with 0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355 not found: ID does not exist" containerID="0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"
Apr 17 17:28:02.479204 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.479120 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"} err="failed to get container status \"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355\": rpc error: code = NotFound desc = could not find container \"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355\": container with ID starting with 0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355 not found: ID does not exist"
Apr 17 17:28:02.479204 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.479155 2580 scope.go:117] "RemoveContainer" containerID="3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"
Apr 17 17:28:02.479434 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.479412 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb\": container with ID starting with 3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb not found: ID does not exist" containerID="3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"
Apr 17 17:28:02.479479 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.479444 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"} err="failed to get container status \"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb\": rpc error: code = NotFound desc = could not find container \"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb\": container with ID starting with 3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb not found: ID does not exist"
Apr 17 17:28:02.479479 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.479465 2580 scope.go:117] "RemoveContainer" containerID="8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"
Apr 17 17:28:02.479754 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.479732 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5\": container with ID starting with 8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5 not found: ID does not exist" containerID="8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"
Apr 17 17:28:02.479832 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.479756 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"} err="failed to get container status \"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5\": rpc error: code = NotFound desc = could not find container \"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5\": container with ID starting with 8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5 not found: ID does not exist"
Apr 17 17:28:02.479832 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.479774 2580 scope.go:117] "RemoveContainer" containerID="409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"
Apr 17 17:28:02.480032 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.480012 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d\": container with ID starting with 409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d not found: ID does not exist" containerID="409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"
Apr 17 17:28:02.480078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480040 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"} err="failed to get container status \"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d\": rpc error: code = NotFound desc = could not find container \"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d\": container with ID starting with 409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d not found: ID does not exist"
Apr 17 17:28:02.480078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480063 2580 scope.go:117] "RemoveContainer" containerID="1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"
Apr 17 17:28:02.480319 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480301 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:28:02.480319 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.480314 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53\": container with ID starting with 1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53 not found: ID does not exist" containerID="1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"
Apr 17 17:28:02.480448 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480333 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"} err="failed to get container status \"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53\": rpc error: code = NotFound desc = could not find container \"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53\": container with ID starting with 1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53 not found: ID does not exist"
Apr 17 17:28:02.480448 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480346 2580 scope.go:117] "RemoveContainer" containerID="27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"
Apr 17 17:28:02.480654 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:02.480633 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40\": container with ID starting with 27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40 not found: ID does not exist" containerID="27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"
Apr 17 17:28:02.480717 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480660 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"} err="failed to get container status \"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40\": rpc error: code = NotFound desc = could not find container \"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40\": container with ID starting with 27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40 not found: ID does not exist"
Apr 17 17:28:02.480717 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480677 2580 scope.go:117] "RemoveContainer" containerID="942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"
Apr 17 17:28:02.480791 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480761 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="alertmanager"
Apr 17 17:28:02.480791 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480773 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="alertmanager"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480790 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="init-config-reloader"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480798 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="init-config-reloader"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480814 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="config-reloader"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480824 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="config-reloader"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480834 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480842 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480854 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="prom-label-proxy"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480863 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="prom-label-proxy"
Apr 17 17:28:02.480879 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480874 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-metric"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480883 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-metric"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480883 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7"} err="failed to get container status \"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7\": rpc error: code = NotFound desc = could not find container \"942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7\": container with ID starting with 942a6f8cfd45ea987aa8fc7e93f8b7554051bae0e2c0ce93458fff81bdf880a7 not found: ID does not exist"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480900 2580 scope.go:117] "RemoveContainer" containerID="0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480901 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-web"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.480937 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-web"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481004 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="prom-label-proxy"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481013 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="alertmanager"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481021 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="config-reloader"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481029 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-web"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481036 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481042 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" containerName="kube-rbac-proxy-metric"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481139 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355"} err="failed to get container status \"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355\": rpc error: code = NotFound desc = could not find container \"0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355\": container with ID starting with 0abcebd49b95a40443078b38e8d545547cfb790a3c230e2d57165dd9ca4f6355 not found: ID does not exist"
Apr 17 17:28:02.481188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481164 2580 scope.go:117] "RemoveContainer" containerID="3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"
Apr 17 17:28:02.481689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481405 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb"} err="failed to get container status \"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb\": rpc error: code = NotFound desc = could not find container \"3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb\": container with ID starting with 3f030a59215928a9dcdd4c83b5b2a71129ffcc6a1c943067e177253bee7db2eb not found: ID does not exist"
Apr 17 17:28:02.481689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481422 2580 scope.go:117] "RemoveContainer" containerID="8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"
Apr 17 17:28:02.481689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481674 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5"} err="failed to get container status \"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5\": rpc error: code = NotFound desc = could not find container \"8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5\": container with ID starting with 8f48e7b03e217097d0ea83b458b3e0d25cba7b5d380edc534679f286942f1fd5 not found: ID does not exist"
Apr 17 17:28:02.481689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481689 2580 scope.go:117] "RemoveContainer" containerID="409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"
Apr 17 17:28:02.481938 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481919 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d"} err="failed to get container status \"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d\": rpc error: code = NotFound desc = could not find container \"409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d\": container with ID starting with 409f222fa6379924342ecc911ff66627c8321f60d217aa5e345df8f6009f542d not found: ID does not exist"
Apr 17 17:28:02.482008 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.481940 2580 scope.go:117] "RemoveContainer" containerID="1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"
Apr 17 17:28:02.482224 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.482173 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53"} err="failed to get container status \"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53\": rpc error: code = NotFound desc = could not find container \"1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53\": container with ID starting with 1a734b28fec8115302c3a8297b5728a4f620539efbb86c3e9467654a39c10e53 not found: ID does not exist"
Apr 17 17:28:02.482296 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.482225 2580 scope.go:117] "RemoveContainer" containerID="27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"
Apr 17 17:28:02.482448 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.482431 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40"} err="failed to get container status \"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40\": rpc error: code = NotFound desc = could not find container \"27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40\": container with ID starting with 27e5b14bb186d7ab83feb59a7792b218fee7955f99867c362d2b8ebf7d195f40 not found: ID does not exist"
Apr 17 17:28:02.485044 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.485029 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.487871 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.487841 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 17:28:02.487969 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.487841 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 17:28:02.487969 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.487943 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 17:28:02.488066 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.487842 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 17:28:02.488125 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.488107 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 17:28:02.488177 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.488159 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 17:28:02.488287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.488273 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 17:28:02.488386 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.488371 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-z4rfh\""
Apr 17 17:28:02.488673 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.488658 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 17:28:02.493202 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.493180 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 17:28:02.497628 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.497605 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:28:02.640419 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.640345 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1da773b6-e4ed-4f95-b2b0-665baf696140" path="/var/lib/kubelet/pods/1da773b6-e4ed-4f95-b2b0-665baf696140/volumes"
Apr 17 17:28:02.641851 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.641830 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-config-volume\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.641962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.641870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.641962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.641900 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.641962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.641936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a04f1967-3898-40ca-9ed7-804412fa3235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.641985 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a04f1967-3898-40ca-9ed7-804412fa3235-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642019 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a04f1967-3898-40ca-9ed7-804412fa3235-config-out\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642053 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6x5c\" (UniqueName: \"kubernetes.io/projected/a04f1967-3898-40ca-9ed7-804412fa3235-kube-api-access-z6x5c\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642129 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642111 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642246 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642246 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a04f1967-3898-40ca-9ed7-804412fa3235-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642246 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642181 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642246 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-web-config\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.642246 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.642241 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1967-3898-40ca-9ed7-804412fa3235-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.742641 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.742641 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a04f1967-3898-40ca-9ed7-804412fa3235-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.742841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742667 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.742841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742699 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-web-config\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.742841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742716 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1967-3898-40ca-9ed7-804412fa3235-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.742841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-config-volume\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743036 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742897 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743036 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743036 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.742981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a04f1967-3898-40ca-9ed7-804412fa3235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743036 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.743031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a04f1967-3898-40ca-9ed7-804412fa3235-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743218 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.743059 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a04f1967-3898-40ca-9ed7-804412fa3235-config-out\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743218 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.743114 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6x5c\" (UniqueName: \"kubernetes.io/projected/a04f1967-3898-40ca-9ed7-804412fa3235-kube-api-access-z6x5c\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.743218 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.743181 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.744336 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.743807 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a04f1967-3898-40ca-9ed7-804412fa3235-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.744336 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.744040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a04f1967-3898-40ca-9ed7-804412fa3235-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746046 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.745677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-web-config\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746046 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.745677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746046 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.745760 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746046 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.745822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-config-volume\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746046 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.745837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746411 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.746388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746467 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.746397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a04f1967-3898-40ca-9ed7-804412fa3235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746673 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.746652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a04f1967-3898-40ca-9ed7-804412fa3235-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.746768 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.746754 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a04f1967-3898-40ca-9ed7-804412fa3235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.747456 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.747435 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a04f1967-3898-40ca-9ed7-804412fa3235-config-out\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.753970 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.753949 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6x5c\" (UniqueName: \"kubernetes.io/projected/a04f1967-3898-40ca-9ed7-804412fa3235-kube-api-access-z6x5c\") pod \"alertmanager-main-0\" (UID: \"a04f1967-3898-40ca-9ed7-804412fa3235\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.795328 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.795297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:28:02.926894 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:02.925206 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:28:02.928234 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:28:02.928194 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04f1967_3898_40ca_9ed7_804412fa3235.slice/crio-f2c0b530b062789dd4567327afa46477d67215b3f05477049430550f0dd5d865 WatchSource:0}: Error finding container f2c0b530b062789dd4567327afa46477d67215b3f05477049430550f0dd5d865: Status 404 returned error can't find the container with id f2c0b530b062789dd4567327afa46477d67215b3f05477049430550f0dd5d865
Apr 17 17:28:03.423154 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:03.423118 2580 generic.go:358] "Generic (PLEG): container finished" podID="a04f1967-3898-40ca-9ed7-804412fa3235" containerID="96b1fe928ff7fa4a54ce6ef4ca27ab1482196a7faa472c5c8f6862ce1a3e3150" exitCode=0
Apr 17 17:28:03.423597 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:03.423196 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerDied","Data":"96b1fe928ff7fa4a54ce6ef4ca27ab1482196a7faa472c5c8f6862ce1a3e3150"}
Apr 17 17:28:03.423597 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:03.423223 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"f2c0b530b062789dd4567327afa46477d67215b3f05477049430550f0dd5d865"}
Apr 17 17:28:04.429505 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.429471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"c6af3e16c7e1d371dbffa4ee35e0f7e32b65d50a5e5f5b29e91e510336bd526a"}
Apr 17 17:28:04.429505 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.429507 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"32d5671e646d1a7bf13fe33ca357cac3533c786ef917dfec09c8db67be1a6dd9"}
Apr 17 17:28:04.430386 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.429518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"e7f03a62ddfc8fa0020e434ca24c01625e39866406390dd2a5c50134c6b7bebd"}
Apr 17 17:28:04.430386 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.429526 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"bc19fd2594cc75b891abe60a6ad54003d28c949aebab8486415bc79b5fbb7753"}
Apr 17 17:28:04.430386 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.429536 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"88e39c2565a6f9cf24ef20590a8f8fa82c1fb5174556e82dd235b303c46f4d13"}
Apr 17 17:28:04.430386 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.429544 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a04f1967-3898-40ca-9ed7-804412fa3235","Type":"ContainerStarted","Data":"62dfb2aa2d11dcceee3aeb4a763890ad356f80392b57f3d4299c713b2b228114"}
Apr 17 17:28:04.469355 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:04.469294 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.469274132 podStartE2EDuration="2.469274132s" podCreationTimestamp="2026-04-17 17:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:04.468356616 +0000 UTC m=+250.439639958" watchObservedRunningTime="2026-04-17 17:28:04.469274132 +0000 UTC m=+250.440557473"
Apr 17 17:28:06.373655 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:06.373601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:28:06.375862 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:06.375843 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1227f475-d747-4720-ad95-d72a46d6d1fb-metrics-certs\") pod \"network-metrics-daemon-knvfd\" (UID: \"1227f475-d747-4720-ad95-d72a46d6d1fb\") " pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:28:06.640286 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:06.640201 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7c8rp\""
Apr 17 17:28:06.647811 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:06.647789 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-knvfd"
Apr 17 17:28:06.779454 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:06.779424 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-knvfd"]
Apr 17 17:28:06.782686 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:28:06.782644 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1227f475_d747_4720_ad95_d72a46d6d1fb.slice/crio-71799b6547aaf087faf83048edbedc1e4805516926ce79cbe9701ed4464a5fb2 WatchSource:0}: Error finding container 71799b6547aaf087faf83048edbedc1e4805516926ce79cbe9701ed4464a5fb2: Status 404 returned error can't find the container with id 71799b6547aaf087faf83048edbedc1e4805516926ce79cbe9701ed4464a5fb2
Apr 17 17:28:07.440127 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:07.440085 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-knvfd" event={"ID":"1227f475-d747-4720-ad95-d72a46d6d1fb","Type":"ContainerStarted","Data":"71799b6547aaf087faf83048edbedc1e4805516926ce79cbe9701ed4464a5fb2"}
Apr 17 17:28:08.446057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:08.445973 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-knvfd" event={"ID":"1227f475-d747-4720-ad95-d72a46d6d1fb","Type":"ContainerStarted","Data":"34c1a346c358eee68707781e9b6a02281b01ff80dce9d14d7b4d5583a75eb627"}
Apr 17 17:28:08.446057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:08.446019 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-knvfd" event={"ID":"1227f475-d747-4720-ad95-d72a46d6d1fb","Type":"ContainerStarted","Data":"31d159301e8ca1bec3695c3b4e5384012a09ae2a68398ec25c3b69a6d841a5f7"}
Apr 17 17:28:08.465900 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:08.465846 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-knvfd" podStartSLOduration=253.601025552 podStartE2EDuration="4m14.465825486s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:28:06.784559844 +0000 UTC m=+252.755843163" lastFinishedPulling="2026-04-17 17:28:07.649359777 +0000 UTC m=+253.620643097" observedRunningTime="2026-04-17 17:28:08.464432966 +0000 UTC m=+254.435716308" watchObservedRunningTime="2026-04-17 17:28:08.465825486 +0000 UTC m=+254.437108827"
Apr 17 17:28:24.067771 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:24.067689 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-576f4646d-j9plt"]
Apr 17 17:28:49.087527 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.087482 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-576f4646d-j9plt" podUID="53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" containerName="console" containerID="cri-o://c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2" gracePeriod=15
Apr 17 17:28:49.329652 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.329626 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-576f4646d-j9plt_53aa45fa-1f1d-4677-a7e1-d6d47ec78efa/console/0.log"
Apr 17 17:28:49.329791 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.329690 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576f4646d-j9plt"
Apr 17 17:28:49.421312 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421221 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-serving-cert\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421312 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421279 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-config\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421314 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-oauth-serving-cert\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421344 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-oauth-config\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421371 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thptk\" (UniqueName: \"kubernetes.io/projected/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-kube-api-access-thptk\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421426 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-trusted-ca-bundle\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421458 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-service-ca\") pod \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\" (UID: \"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa\") "
Apr 17 17:28:49.421935 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421905 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-config" (OuterVolumeSpecName: "console-config") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:49.422042 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421937 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:49.422042 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.421985 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:49.422042 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.422031 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-service-ca" (OuterVolumeSpecName: "service-ca") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:49.423670 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.423645 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:49.423752 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.423666 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:49.423752 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.423740 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-kube-api-access-thptk" (OuterVolumeSpecName: "kube-api-access-thptk") pod "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" (UID: "53aa45fa-1f1d-4677-a7e1-d6d47ec78efa"). InnerVolumeSpecName "kube-api-access-thptk".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:49.522541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522502 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.522541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522534 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-oauth-serving-cert\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.522541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522544 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-oauth-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.522795 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522554 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thptk\" (UniqueName: \"kubernetes.io/projected/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-kube-api-access-thptk\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.522795 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522564 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-trusted-ca-bundle\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.522795 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522573 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-service-ca\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.522795 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.522603 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa-console-serving-cert\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:28:49.570967 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.570942 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-576f4646d-j9plt_53aa45fa-1f1d-4677-a7e1-d6d47ec78efa/console/0.log" Apr 17 17:28:49.571120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.570981 2580 generic.go:358] "Generic (PLEG): container finished" podID="53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" containerID="c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2" exitCode=2 Apr 17 17:28:49.571120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.571074 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-576f4646d-j9plt" Apr 17 17:28:49.571202 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.571068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576f4646d-j9plt" event={"ID":"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa","Type":"ContainerDied","Data":"c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2"} Apr 17 17:28:49.571202 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.571178 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576f4646d-j9plt" event={"ID":"53aa45fa-1f1d-4677-a7e1-d6d47ec78efa","Type":"ContainerDied","Data":"2a4b8f8d2b9febb422b0215da4d9074ced3a4f97a46c7ace64931f38629c8b74"} Apr 17 17:28:49.571202 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.571195 2580 scope.go:117] "RemoveContainer" containerID="c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2" Apr 17 17:28:49.579222 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.579204 2580 scope.go:117] "RemoveContainer" containerID="c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2" Apr 17 17:28:49.579467 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:28:49.579449 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2\": container with ID starting with c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2 not found: ID does not exist" containerID="c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2" Apr 17 17:28:49.579539 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.579477 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2"} err="failed to get container status \"c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2\": rpc error: code = NotFound desc = could not find container \"c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2\": container with ID starting with c9ad866482a76a46890f7aca72b005bd7093d5ef9ee2e7a13971c2b850a8e2a2 not found: ID does not exist" Apr 17 17:28:49.591927 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.591904 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-576f4646d-j9plt"] Apr 17 17:28:49.595833 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:49.595813 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-576f4646d-j9plt"] Apr 17 17:28:50.640415 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:50.640380 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" path="/var/lib/kubelet/pods/53aa45fa-1f1d-4677-a7e1-d6d47ec78efa/volumes" Apr 17 17:28:54.528833 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:54.528806 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:28:54.529612 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:54.529570 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:28:54.535993 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:28:54.535970 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:29:31.220480 ip-10-0-130-19 
kubenswrapper[2580]: I0417 17:29:31.220446 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw"] Apr 17 17:29:31.220898 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.220800 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" containerName="console" Apr 17 17:29:31.220898 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.220813 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" containerName="console" Apr 17 17:29:31.220898 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.220867 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="53aa45fa-1f1d-4677-a7e1-d6d47ec78efa" containerName="console" Apr 17 17:29:31.223677 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.223661 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.227155 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.227134 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:29:31.227255 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.227193 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:29:31.228337 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.228323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2xqm\"" Apr 17 17:29:31.233240 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.233214 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw"] Apr 17 17:29:31.356705 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.356648 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svc4q\" (UniqueName: \"kubernetes.io/projected/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-kube-api-access-svc4q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.356893 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.356730 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.356893 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.356806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.457709 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.457613 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.457709 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.457661 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svc4q\" (UniqueName: \"kubernetes.io/projected/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-kube-api-access-svc4q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.457709 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.457708 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.457973 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.457956 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.458019 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.458001 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.471568 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.471538 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svc4q\" (UniqueName: \"kubernetes.io/projected/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-kube-api-access-svc4q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.533149 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.533109 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:31.661922 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.661888 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw"] Apr 17 17:29:31.665938 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:29:31.665909 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea9c181_bca4_4d16_8ac5_a02e47d4cc6b.slice/crio-a4bd1e87e01f94478268aeccece768978c65e613043aff5819a6e4d986c7e6d3 WatchSource:0}: Error finding container a4bd1e87e01f94478268aeccece768978c65e613043aff5819a6e4d986c7e6d3: Status 404 returned error can't find the container with id a4bd1e87e01f94478268aeccece768978c65e613043aff5819a6e4d986c7e6d3 Apr 17 17:29:31.667709 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.667691 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:29:31.688095 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:31.688053 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" event={"ID":"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b","Type":"ContainerStarted","Data":"a4bd1e87e01f94478268aeccece768978c65e613043aff5819a6e4d986c7e6d3"} Apr 17 17:29:37.707993 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:37.707956 2580 generic.go:358] "Generic (PLEG): container finished" podID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerID="817f474af039efadf6efa4e9c7ff64b05c64aaad805f137c2f6841ac27a531b3" exitCode=0 Apr 17 17:29:37.708384 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:37.708017 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" event={"ID":"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b","Type":"ContainerDied","Data":"817f474af039efadf6efa4e9c7ff64b05c64aaad805f137c2f6841ac27a531b3"} Apr 17 17:29:39.715517 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:39.715487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" event={"ID":"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b","Type":"ContainerStarted","Data":"d726c9d0e22e2d992f73fa496cc34b70933ea651e3bfe7b69b85df133bf90679"} Apr 17 17:29:40.720867 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:40.720830 2580 generic.go:358] "Generic (PLEG): container finished" podID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerID="d726c9d0e22e2d992f73fa496cc34b70933ea651e3bfe7b69b85df133bf90679" exitCode=0 Apr 17 17:29:40.721271 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:40.720916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" event={"ID":"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b","Type":"ContainerDied","Data":"d726c9d0e22e2d992f73fa496cc34b70933ea651e3bfe7b69b85df133bf90679"} Apr 17 17:29:47.743323 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:47.743283 2580 generic.go:358] "Generic (PLEG): container finished" podID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerID="5dce87c8f85a2145eecd67ce28a24fdb0ed99e0162d1ab92ff9ffba5293f138c" exitCode=0 Apr 17 17:29:47.743806 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:47.743370 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" event={"ID":"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b","Type":"ContainerDied","Data":"5dce87c8f85a2145eecd67ce28a24fdb0ed99e0162d1ab92ff9ffba5293f138c"} Apr 17 17:29:48.863248 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.863225 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:48.891460 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.891433 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-util\") pod \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " Apr 17 17:29:48.891616 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.891500 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svc4q\" (UniqueName: \"kubernetes.io/projected/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-kube-api-access-svc4q\") pod \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " Apr 17 17:29:48.891616 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.891549 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-bundle\") pod \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\" (UID: \"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b\") " Apr 17 17:29:48.892313 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.892287 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-bundle" (OuterVolumeSpecName: "bundle") pod "7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" (UID: "7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:29:48.894945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.894915 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-kube-api-access-svc4q" (OuterVolumeSpecName: "kube-api-access-svc4q") pod "7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" (UID: "7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b"). InnerVolumeSpecName "kube-api-access-svc4q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:29:48.896860 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.896826 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-util" (OuterVolumeSpecName: "util") pod "7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" (UID: "7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:29:48.992730 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.992692 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-bundle\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:29:48.992730 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.992722 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-util\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:29:48.992730 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:48.992731 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svc4q\" (UniqueName: \"kubernetes.io/projected/7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b-kube-api-access-svc4q\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:29:49.750134 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:49.750105 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" Apr 17 17:29:49.750297 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:49.750104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c2btbw" event={"ID":"7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b","Type":"ContainerDied","Data":"a4bd1e87e01f94478268aeccece768978c65e613043aff5819a6e4d986c7e6d3"} Apr 17 17:29:49.750297 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:49.750211 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4bd1e87e01f94478268aeccece768978c65e613043aff5819a6e4d986c7e6d3" Apr 17 17:29:57.388287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388259 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-kdb5m"] Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388543 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="extract" Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388554 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="extract" Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388567 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="util" Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388573 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="util" Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388602 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="pull" Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388608 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="pull" Apr 17 17:29:57.388689 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.388656 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ea9c181-bca4-4d16-8ac5-a02e47d4cc6b" containerName="extract" Apr 17 17:29:57.442186 ip-10-0-130-19 
kubenswrapper[2580]: I0417 17:29:57.442155 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-kdb5m"] Apr 17 17:29:57.442363 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.442278 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.445009 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.444985 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 17:29:57.445158 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.444986 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 17:29:57.445158 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.445056 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 17:29:57.445158 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.445103 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 17:29:57.446077 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.446055 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 17:29:57.446187 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.446112 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-d9wx9\"" Apr 17 17:29:57.560160 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.560118 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.560360 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.560190 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7f2h\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-kube-api-access-s7f2h\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.560360 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.560250 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/22b31e6b-6e56-49f4-82bf-45a32a95708c-cabundle0\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.660918 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.660829 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.660918 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.660881 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7f2h\" (UniqueName: 
\"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-kube-api-access-s7f2h\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.660918 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.660906 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/22b31e6b-6e56-49f4-82bf-45a32a95708c-cabundle0\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.661205 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.660986 2580 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:29:57.661205 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.661013 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:29:57.661205 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.661027 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-kdb5m: references non-existent secret key: ca.crt Apr 17 17:29:57.661205 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.661105 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates podName:22b31e6b-6e56-49f4-82bf-45a32a95708c nodeName:}" failed. No retries permitted until 2026-04-17 17:29:58.161082583 +0000 UTC m=+364.132365905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates") pod "keda-operator-ffbb595cb-kdb5m" (UID: "22b31e6b-6e56-49f4-82bf-45a32a95708c") : references non-existent secret key: ca.crt Apr 17 17:29:57.661510 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.661493 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/22b31e6b-6e56-49f4-82bf-45a32a95708c-cabundle0\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.673210 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.673179 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7f2h\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-kube-api-access-s7f2h\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:57.707924 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.707891 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb"] Apr 17 17:29:57.733251 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.733217 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb"] Apr 17 17:29:57.733423 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.733372 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.736293 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.736270 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 17:29:57.862064 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.862022 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.862259 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.862091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.862259 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.862127 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hswp\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-kube-api-access-8hswp\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.963164 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.963068 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hswp\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-kube-api-access-8hswp\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.963322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.963193 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.963322 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.963239 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.963441 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.963372 2580 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:29:57.963441 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.963394 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:29:57.963441 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.963412 2580 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 17:29:57.963441 
ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.963432 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 17:29:57.963679 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:57.963490 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates podName:d317771c-56ed-4ad1-aa3c-d7ce861fa7f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:58.463471737 +0000 UTC m=+364.434755067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates") pod "keda-metrics-apiserver-7c9f485588-cvmzb" (UID: "d317771c-56ed-4ad1-aa3c-d7ce861fa7f0") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 17:29:57.963679 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.963658 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:57.972701 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:57.972667 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hswp\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-kube-api-access-8hswp\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:58.165675 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:58.165636 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:58.165836 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.165789 2580 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:29:58.165836 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.165808 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:29:58.165836 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.165818 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-kdb5m: references non-existent secret key: ca.crt Apr 17 17:29:58.165937 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.165869 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates podName:22b31e6b-6e56-49f4-82bf-45a32a95708c nodeName:}" failed. No retries permitted until 2026-04-17 17:29:59.165853975 +0000 UTC m=+365.137137300 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates") pod "keda-operator-ffbb595cb-kdb5m" (UID: "22b31e6b-6e56-49f4-82bf-45a32a95708c") : references non-existent secret key: ca.crt Apr 17 17:29:58.468712 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:58.468675 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:58.469074 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.468814 2580 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:29:58.469074 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.468832 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:29:58.469074 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.468852 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb: references non-existent secret key: tls.crt Apr 17 17:29:58.469074 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:58.468905 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates podName:d317771c-56ed-4ad1-aa3c-d7ce861fa7f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:59.468891079 +0000 UTC m=+365.440174403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates") pod "keda-metrics-apiserver-7c9f485588-cvmzb" (UID: "d317771c-56ed-4ad1-aa3c-d7ce861fa7f0") : references non-existent secret key: tls.crt Apr 17 17:29:59.176120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:59.176078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:29:59.176343 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.176226 2580 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:29:59.176343 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.176248 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:29:59.176343 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.176257 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-kdb5m: references non-existent secret key: ca.crt Apr 17 17:29:59.176343 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.176315 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates podName:22b31e6b-6e56-49f4-82bf-45a32a95708c nodeName:}" failed. No retries permitted until 2026-04-17 17:30:01.176301293 +0000 UTC m=+367.147584619 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates") pod "keda-operator-ffbb595cb-kdb5m" (UID: "22b31e6b-6e56-49f4-82bf-45a32a95708c") : references non-existent secret key: ca.crt Apr 17 17:29:59.478563 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:29:59.478472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:29:59.478933 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.478646 2580 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:29:59.478933 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.478667 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:29:59.478933 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.478684 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb: references non-existent secret key: tls.crt Apr 17 17:29:59.478933 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:29:59.478738 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates podName:d317771c-56ed-4ad1-aa3c-d7ce861fa7f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:30:01.47872402 +0000 UTC m=+367.450007339 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates") pod "keda-metrics-apiserver-7c9f485588-cvmzb" (UID: "d317771c-56ed-4ad1-aa3c-d7ce861fa7f0") : references non-existent secret key: tls.crt Apr 17 17:30:01.193914 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:01.193870 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:30:01.194405 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.194034 2580 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:30:01.194405 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.194061 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:30:01.194405 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.194076 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-kdb5m: references non-existent secret key: ca.crt Apr 17 17:30:01.194405 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.194147 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates podName:22b31e6b-6e56-49f4-82bf-45a32a95708c nodeName:}" failed. No retries permitted until 2026-04-17 17:30:05.194127802 +0000 UTC m=+371.165411126 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates") pod "keda-operator-ffbb595cb-kdb5m" (UID: "22b31e6b-6e56-49f4-82bf-45a32a95708c") : references non-existent secret key: ca.crt Apr 17 17:30:01.497168 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:01.497076 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:30:01.497313 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.497235 2580 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:30:01.497313 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.497253 2580 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:30:01.497313 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.497269 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb: references non-existent secret key: tls.crt Apr 17 17:30:01.497423 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:30:01.497320 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates podName:d317771c-56ed-4ad1-aa3c-d7ce861fa7f0 nodeName:}" failed. No retries permitted until 2026-04-17 17:30:05.497301135 +0000 UTC m=+371.468584456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates") pod "keda-metrics-apiserver-7c9f485588-cvmzb" (UID: "d317771c-56ed-4ad1-aa3c-d7ce861fa7f0") : references non-existent secret key: tls.crt Apr 17 17:30:05.228929 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.228889 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:30:05.231413 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.231387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/22b31e6b-6e56-49f4-82bf-45a32a95708c-certificates\") pod \"keda-operator-ffbb595cb-kdb5m\" (UID: \"22b31e6b-6e56-49f4-82bf-45a32a95708c\") " pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:30:05.252456 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.252406 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:30:05.370416 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.370367 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-kdb5m"] Apr 17 17:30:05.372502 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:30:05.372459 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b31e6b_6e56_49f4_82bf_45a32a95708c.slice/crio-ef9555ad725230025e6d5ff3b091af3c82d99909ffc665ff565d3710ea194cef WatchSource:0}: Error finding container ef9555ad725230025e6d5ff3b091af3c82d99909ffc665ff565d3710ea194cef: Status 404 returned error can't find the container with id ef9555ad725230025e6d5ff3b091af3c82d99909ffc665ff565d3710ea194cef Apr 17 17:30:05.532362 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.532270 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:30:05.534966 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.534930 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d317771c-56ed-4ad1-aa3c-d7ce861fa7f0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cvmzb\" (UID: \"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:30:05.543551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.543525 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:30:05.666676 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.666607 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb"] Apr 17 17:30:05.668711 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:30:05.668682 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd317771c_56ed_4ad1_aa3c_d7ce861fa7f0.slice/crio-02495e1e66fea17fc0175600f92008ad2ee886abff130b78d083a29bb37aaef4 WatchSource:0}: Error finding container 02495e1e66fea17fc0175600f92008ad2ee886abff130b78d083a29bb37aaef4: Status 404 returned error can't find the container with id 02495e1e66fea17fc0175600f92008ad2ee886abff130b78d083a29bb37aaef4 Apr 17 17:30:05.800817 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.800726 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" event={"ID":"22b31e6b-6e56-49f4-82bf-45a32a95708c","Type":"ContainerStarted","Data":"ef9555ad725230025e6d5ff3b091af3c82d99909ffc665ff565d3710ea194cef"} Apr 17 17:30:05.801677 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:05.801655 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" event={"ID":"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0","Type":"ContainerStarted","Data":"02495e1e66fea17fc0175600f92008ad2ee886abff130b78d083a29bb37aaef4"} Apr 17 17:30:11.821431 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:11.821386 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" event={"ID":"22b31e6b-6e56-49f4-82bf-45a32a95708c","Type":"ContainerStarted","Data":"f4ab1f4dab3295d8415a0536e8796d028a3b43244c4d56de1bf94d404632d645"} Apr 17 17:30:11.821929 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:11.821517 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:30:11.822744 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:11.822717 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" event={"ID":"d317771c-56ed-4ad1-aa3c-d7ce861fa7f0","Type":"ContainerStarted","Data":"70c42a1596b1f77676be8109ec96f4043ab1fbdaf08c06e198c24453e1e454a2"} Apr 17 17:30:11.822907 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:11.822889 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:30:11.839227 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:11.839168 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" podStartSLOduration=9.166854967 podStartE2EDuration="14.839153087s" podCreationTimestamp="2026-04-17 17:29:57 +0000 UTC" firstStartedPulling="2026-04-17 17:30:05.37380955 +0000 UTC m=+371.345092870" lastFinishedPulling="2026-04-17 17:30:11.046107663 +0000 UTC m=+377.017390990" observedRunningTime="2026-04-17 17:30:11.838158906 +0000 UTC m=+377.809442535" watchObservedRunningTime="2026-04-17 17:30:11.839153087 +0000 UTC m=+377.810436427" Apr 17 17:30:11.858445 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:11.858380 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" podStartSLOduration=9.486635956 
podStartE2EDuration="14.858357545s" podCreationTimestamp="2026-04-17 17:29:57 +0000 UTC" firstStartedPulling="2026-04-17 17:30:05.670047319 +0000 UTC m=+371.641330638" lastFinishedPulling="2026-04-17 17:30:11.041768905 +0000 UTC m=+377.013052227" observedRunningTime="2026-04-17 17:30:11.856787719 +0000 UTC m=+377.828071061" watchObservedRunningTime="2026-04-17 17:30:11.858357545 +0000 UTC m=+377.829640887" Apr 17 17:30:22.831386 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:22.831348 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cvmzb" Apr 17 17:30:32.828260 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:30:32.828226 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-kdb5m" Apr 17 17:31:06.478457 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.478422 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv"] Apr 17 17:31:06.481089 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.481072 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.483749 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.483725 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:31:06.483749 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.483726 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 17:31:06.483940 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.483823 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-cgh4w\"" Apr 17 17:31:06.484652 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.484637 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:31:06.491039 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.491017 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv"] Apr 17 17:31:06.499798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.499768 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-vm8r6"] Apr 17 17:31:06.502650 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.502627 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.505470 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.505445 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 17:31:06.505470 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.505445 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vg6fs\"" Apr 17 17:31:06.513416 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.513390 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-vm8r6"] Apr 17 17:31:06.625814 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.625774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/516ed833-a365-4252-945b-a1f54e70350b-data\") pod \"seaweedfs-86cc847c5c-vm8r6\" (UID: \"516ed833-a365-4252-945b-a1f54e70350b\") " pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.625989 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.625832 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a50faf80-a870-4209-9a73-8dc84fd00c4b-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fkxfv\" (UID: \"a50faf80-a870-4209-9a73-8dc84fd00c4b\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.625989 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.625854 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkwxp\" (UniqueName: \"kubernetes.io/projected/a50faf80-a870-4209-9a73-8dc84fd00c4b-kube-api-access-nkwxp\") pod \"llmisvc-controller-manager-68cc5db7c4-fkxfv\" (UID: \"a50faf80-a870-4209-9a73-8dc84fd00c4b\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.625989 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.625876 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99z8k\" (UniqueName: \"kubernetes.io/projected/516ed833-a365-4252-945b-a1f54e70350b-kube-api-access-99z8k\") pod \"seaweedfs-86cc847c5c-vm8r6\" (UID: \"516ed833-a365-4252-945b-a1f54e70350b\") " pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.726287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.726248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a50faf80-a870-4209-9a73-8dc84fd00c4b-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fkxfv\" (UID: \"a50faf80-a870-4209-9a73-8dc84fd00c4b\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.726287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.726287 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkwxp\" (UniqueName: \"kubernetes.io/projected/a50faf80-a870-4209-9a73-8dc84fd00c4b-kube-api-access-nkwxp\") pod \"llmisvc-controller-manager-68cc5db7c4-fkxfv\" (UID: \"a50faf80-a870-4209-9a73-8dc84fd00c4b\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.726508 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.726304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99z8k\" (UniqueName: \"kubernetes.io/projected/516ed833-a365-4252-945b-a1f54e70350b-kube-api-access-99z8k\") pod \"seaweedfs-86cc847c5c-vm8r6\" 
(UID: \"516ed833-a365-4252-945b-a1f54e70350b\") " pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.726508 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.726366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/516ed833-a365-4252-945b-a1f54e70350b-data\") pod \"seaweedfs-86cc847c5c-vm8r6\" (UID: \"516ed833-a365-4252-945b-a1f54e70350b\") " pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.726740 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.726721 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/516ed833-a365-4252-945b-a1f54e70350b-data\") pod \"seaweedfs-86cc847c5c-vm8r6\" (UID: \"516ed833-a365-4252-945b-a1f54e70350b\") " pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.728738 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.728681 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a50faf80-a870-4209-9a73-8dc84fd00c4b-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-fkxfv\" (UID: \"a50faf80-a870-4209-9a73-8dc84fd00c4b\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.735253 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.735227 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99z8k\" (UniqueName: \"kubernetes.io/projected/516ed833-a365-4252-945b-a1f54e70350b-kube-api-access-99z8k\") pod \"seaweedfs-86cc847c5c-vm8r6\" (UID: \"516ed833-a365-4252-945b-a1f54e70350b\") " pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.735380 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.735354 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkwxp\" (UniqueName: \"kubernetes.io/projected/a50faf80-a870-4209-9a73-8dc84fd00c4b-kube-api-access-nkwxp\") pod \"llmisvc-controller-manager-68cc5db7c4-fkxfv\" (UID: \"a50faf80-a870-4209-9a73-8dc84fd00c4b\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.793168 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.793124 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:06.815020 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.814990 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:06.947928 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.947904 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv"] Apr 17 17:31:06.950629 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:31:06.950598 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda50faf80_a870_4209_9a73_8dc84fd00c4b.slice/crio-f96e2780f5048d8a23518b0161d24c64f8d223630a742d5cd0251deb1d02f7ed WatchSource:0}: Error finding container f96e2780f5048d8a23518b0161d24c64f8d223630a742d5cd0251deb1d02f7ed: Status 404 returned error can't find the container with id f96e2780f5048d8a23518b0161d24c64f8d223630a742d5cd0251deb1d02f7ed Apr 17 17:31:06.976708 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.976677 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-vm8r6"] Apr 17 17:31:06.980788 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:31:06.980722 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod516ed833_a365_4252_945b_a1f54e70350b.slice/crio-17be75099efdb218b1bd79db7825a183c4be862dae38a6d09b0299b44dd3555d WatchSource:0}: Error finding container 17be75099efdb218b1bd79db7825a183c4be862dae38a6d09b0299b44dd3555d: Status 404 returned error can't find the container with id 17be75099efdb218b1bd79db7825a183c4be862dae38a6d09b0299b44dd3555d Apr 17 17:31:06.983850 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:06.983819 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" event={"ID":"a50faf80-a870-4209-9a73-8dc84fd00c4b","Type":"ContainerStarted","Data":"f96e2780f5048d8a23518b0161d24c64f8d223630a742d5cd0251deb1d02f7ed"} Apr 17 17:31:07.988568 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:07.988529 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-vm8r6" event={"ID":"516ed833-a365-4252-945b-a1f54e70350b","Type":"ContainerStarted","Data":"17be75099efdb218b1bd79db7825a183c4be862dae38a6d09b0299b44dd3555d"} Apr 17 17:31:08.993475 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:08.993420 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" event={"ID":"a50faf80-a870-4209-9a73-8dc84fd00c4b","Type":"ContainerStarted","Data":"130592e3dec20fe0f5aa5a4535781d038702ecb604a4cc7df59a652ae58ec29d"} Apr 17 17:31:08.993942 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:08.993602 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:09.011020 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:09.010852 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" podStartSLOduration=1.110839806 podStartE2EDuration="3.010836339s" podCreationTimestamp="2026-04-17 17:31:06 +0000 UTC" firstStartedPulling="2026-04-17 17:31:06.952305995 +0000 UTC m=+432.923589314" lastFinishedPulling="2026-04-17 17:31:08.852302526 +0000 UTC m=+434.823585847" observedRunningTime="2026-04-17 17:31:09.010432941 +0000 UTC m=+434.981716283" watchObservedRunningTime="2026-04-17 17:31:09.010836339 +0000 UTC m=+434.982119744" Apr 17 17:31:11.000890 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:11.000849 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/seaweedfs-86cc847c5c-vm8r6" event={"ID":"516ed833-a365-4252-945b-a1f54e70350b","Type":"ContainerStarted","Data":"3b191c3e6a4bed783515a4c85b32fe2ce459b0f87a577fd50e575aa6c33a57c4"} Apr 17 17:31:11.001310 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:11.000978 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:11.017956 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:11.017906 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-vm8r6" podStartSLOduration=1.562721211 podStartE2EDuration="5.017891717s" podCreationTimestamp="2026-04-17 17:31:06 +0000 UTC" firstStartedPulling="2026-04-17 17:31:06.982126655 +0000 UTC m=+432.953409979" lastFinishedPulling="2026-04-17 17:31:10.43729716 +0000 UTC m=+436.408580485" observedRunningTime="2026-04-17 17:31:11.016563696 +0000 UTC m=+436.987847039" watchObservedRunningTime="2026-04-17 17:31:11.017891717 +0000 UTC m=+436.989175058" Apr 17 17:31:17.006702 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:17.006614 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-vm8r6" Apr 17 17:31:39.999492 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:39.999457 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-fkxfv" Apr 17 17:31:51.610927 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.610891 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9786d7974-l64jm"] Apr 17 17:31:51.613149 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.613130 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.616068 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.616040 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:31:51.617180 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617156 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:31:51.617346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617200 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:31:51.617346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617230 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:31:51.617346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617249 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:31:51.617346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617233 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:31:51.617346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617208 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:31:51.617346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.617208 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7hcbl\"" Apr 17 17:31:51.621183 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:31:51.621164 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:31:51.626567 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.626547 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9786d7974-l64jm"] Apr 17 17:31:51.696491 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-console-config\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.696491 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696494 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-trusted-ca-bundle\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.696734 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696554 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c5e1428-d554-470d-bace-00baf72c619f-console-oauth-config\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.696734 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696618 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t59vg\" (UniqueName: \"kubernetes.io/projected/7c5e1428-d554-470d-bace-00baf72c619f-kube-api-access-t59vg\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.696734 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696646 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c5e1428-d554-470d-bace-00baf72c619f-console-serving-cert\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.696734 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696666 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-oauth-serving-cert\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.696734 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.696690 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-service-ca\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797671 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/7c5e1428-d554-470d-bace-00baf72c619f-console-oauth-config\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t59vg\" (UniqueName: \"kubernetes.io/projected/7c5e1428-d554-470d-bace-00baf72c619f-kube-api-access-t59vg\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c5e1428-d554-470d-bace-00baf72c619f-console-serving-cert\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797747 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-oauth-serving-cert\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-service-ca\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797823 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-console-config\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.797863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.797849 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-trusted-ca-bundle\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.798620 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.798568 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-console-config\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.798736 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.798575 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-service-ca\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.798736 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.798635 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-trusted-ca-bundle\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.798736 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.798717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c5e1428-d554-470d-bace-00baf72c619f-oauth-serving-cert\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.800687 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.800670 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c5e1428-d554-470d-bace-00baf72c619f-console-serving-cert\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.800787 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.800765 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c5e1428-d554-470d-bace-00baf72c619f-console-oauth-config\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.807017 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.806991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t59vg\" (UniqueName: \"kubernetes.io/projected/7c5e1428-d554-470d-bace-00baf72c619f-kube-api-access-t59vg\") pod \"console-9786d7974-l64jm\" (UID: \"7c5e1428-d554-470d-bace-00baf72c619f\") " pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:51.922750 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:51.922654 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:31:52.051141 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:52.047891 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9786d7974-l64jm"] Apr 17 17:31:52.051385 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:31:52.051358 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5e1428_d554_470d_bace_00baf72c619f.slice/crio-343cb7457e439f634b085012371aebc57dcddcf61ecebf8aa68b51d90a7a92c6 WatchSource:0}: Error finding container 343cb7457e439f634b085012371aebc57dcddcf61ecebf8aa68b51d90a7a92c6: Status 404 returned error can't find the container with id 343cb7457e439f634b085012371aebc57dcddcf61ecebf8aa68b51d90a7a92c6 Apr 17 17:31:52.121854 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:52.121816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9786d7974-l64jm" event={"ID":"7c5e1428-d554-470d-bace-00baf72c619f","Type":"ContainerStarted","Data":"3d1d7f08ee143e19a4fc03fa43ea2947571e85d4db8db9ea2f9b0d3cfaa04b25"} Apr 17 17:31:52.121989 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:52.121859 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9786d7974-l64jm" event={"ID":"7c5e1428-d554-470d-bace-00baf72c619f","Type":"ContainerStarted","Data":"343cb7457e439f634b085012371aebc57dcddcf61ecebf8aa68b51d90a7a92c6"} Apr 17 17:31:52.141571 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:31:52.141513 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9786d7974-l64jm" podStartSLOduration=1.141495629 podStartE2EDuration="1.141495629s" podCreationTimestamp="2026-04-17 17:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:31:52.140664949 +0000 UTC m=+478.111948289" watchObservedRunningTime="2026-04-17 17:31:52.141495629 +0000 UTC m=+478.112778969" Apr 17 17:32:01.923285 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:01.923235 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:32:01.923786 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:01.923327 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:32:01.928151 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:01.928125 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:32:02.153282 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:02.153247 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9786d7974-l64jm" Apr 17 17:32:31.569198 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.569114 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-dc8r8"] Apr 17 17:32:31.571485 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.571468 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-dc8r8" Apr 17 17:32:31.579824 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.579791 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-dc8r8"] Apr 17 17:32:31.643631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.643570 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmzm7\" (UniqueName: \"kubernetes.io/projected/435b8490-b210-434f-b30e-2543e0137e4f-kube-api-access-qmzm7\") pod \"s3-init-dc8r8\" (UID: \"435b8490-b210-434f-b30e-2543e0137e4f\") " pod="kserve/s3-init-dc8r8" Apr 17 17:32:31.743994 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.743966 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmzm7\" (UniqueName: \"kubernetes.io/projected/435b8490-b210-434f-b30e-2543e0137e4f-kube-api-access-qmzm7\") pod \"s3-init-dc8r8\" (UID: \"435b8490-b210-434f-b30e-2543e0137e4f\") " pod="kserve/s3-init-dc8r8" Apr 17 17:32:31.753669 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.753639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmzm7\" (UniqueName: \"kubernetes.io/projected/435b8490-b210-434f-b30e-2543e0137e4f-kube-api-access-qmzm7\") pod \"s3-init-dc8r8\" (UID: \"435b8490-b210-434f-b30e-2543e0137e4f\") " pod="kserve/s3-init-dc8r8" Apr 17 17:32:31.881521 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:31.881483 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dc8r8" Apr 17 17:32:32.004831 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:32.004792 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-dc8r8"] Apr 17 17:32:32.007766 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:32:32.007739 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod435b8490_b210_434f_b30e_2543e0137e4f.slice/crio-4c430478e07850806b11e7849c45c1d0edf626f2508d651065a5a88457b1f59d WatchSource:0}: Error finding container 4c430478e07850806b11e7849c45c1d0edf626f2508d651065a5a88457b1f59d: Status 404 returned error can't find the container with id 4c430478e07850806b11e7849c45c1d0edf626f2508d651065a5a88457b1f59d Apr 17 17:32:32.237889 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:32.237804 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dc8r8" event={"ID":"435b8490-b210-434f-b30e-2543e0137e4f","Type":"ContainerStarted","Data":"4c430478e07850806b11e7849c45c1d0edf626f2508d651065a5a88457b1f59d"} Apr 17 17:32:37.256018 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:37.255980 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dc8r8" event={"ID":"435b8490-b210-434f-b30e-2543e0137e4f","Type":"ContainerStarted","Data":"67d505aaed812bb7f6b0279d0d3aa659dc07e5033a73a0bc995ef34a8a7ec071"} Apr 17 17:32:37.272340 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:37.272286 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-dc8r8" podStartSLOduration=1.9080720530000002 podStartE2EDuration="6.272266259s" podCreationTimestamp="2026-04-17 17:32:31 +0000 UTC" firstStartedPulling="2026-04-17 17:32:32.009537912 +0000 UTC m=+517.980821231" lastFinishedPulling="2026-04-17 17:32:36.373732114 +0000 UTC m=+522.345015437" observedRunningTime="2026-04-17 17:32:37.270507201 +0000 UTC m=+523.241790554" watchObservedRunningTime="2026-04-17 17:32:37.272266259 +0000 UTC 
Apr 17 17:32:37.272340 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:37.272286 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-dc8r8" podStartSLOduration=1.9080720530000002 podStartE2EDuration="6.272266259s" podCreationTimestamp="2026-04-17 17:32:31 +0000 UTC" firstStartedPulling="2026-04-17 17:32:32.009537912 +0000 UTC m=+517.980821231" lastFinishedPulling="2026-04-17 17:32:36.373732114 +0000 UTC m=+522.345015437" observedRunningTime="2026-04-17 17:32:37.270507201 +0000 UTC m=+523.241790554" watchObservedRunningTime="2026-04-17 17:32:37.272266259 +0000 UTC m=+523.243549600"
Apr 17 17:32:40.267646 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:40.267612 2580 generic.go:358] "Generic (PLEG): container finished" podID="435b8490-b210-434f-b30e-2543e0137e4f" containerID="67d505aaed812bb7f6b0279d0d3aa659dc07e5033a73a0bc995ef34a8a7ec071" exitCode=0
Apr 17 17:32:40.268069 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:40.267682 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dc8r8" event={"ID":"435b8490-b210-434f-b30e-2543e0137e4f","Type":"ContainerDied","Data":"67d505aaed812bb7f6b0279d0d3aa659dc07e5033a73a0bc995ef34a8a7ec071"}
Apr 17 17:32:41.395567 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:41.395545 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dc8r8"
Apr 17 17:32:41.424595 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:41.424524 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmzm7\" (UniqueName: \"kubernetes.io/projected/435b8490-b210-434f-b30e-2543e0137e4f-kube-api-access-qmzm7\") pod \"435b8490-b210-434f-b30e-2543e0137e4f\" (UID: \"435b8490-b210-434f-b30e-2543e0137e4f\") "
Apr 17 17:32:41.427103 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:41.427072 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435b8490-b210-434f-b30e-2543e0137e4f-kube-api-access-qmzm7" (OuterVolumeSpecName: "kube-api-access-qmzm7") pod "435b8490-b210-434f-b30e-2543e0137e4f" (UID: "435b8490-b210-434f-b30e-2543e0137e4f"). InnerVolumeSpecName "kube-api-access-qmzm7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:32:41.526009 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:41.525914 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmzm7\" (UniqueName: \"kubernetes.io/projected/435b8490-b210-434f-b30e-2543e0137e4f-kube-api-access-qmzm7\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:32:42.274926 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:42.274894 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dc8r8" event={"ID":"435b8490-b210-434f-b30e-2543e0137e4f","Type":"ContainerDied","Data":"4c430478e07850806b11e7849c45c1d0edf626f2508d651065a5a88457b1f59d"}
Apr 17 17:32:42.274926 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:42.274917 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dc8r8"
Apr 17 17:32:42.274926 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:42.274925 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c430478e07850806b11e7849c45c1d0edf626f2508d651065a5a88457b1f59d"
Apr 17 17:32:51.578143 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.578063 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"]
Apr 17 17:32:51.578605 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.578574 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="435b8490-b210-434f-b30e-2543e0137e4f" containerName="s3-init"
Apr 17 17:32:51.578653 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.578609 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="435b8490-b210-434f-b30e-2543e0137e4f" containerName="s3-init"
Apr 17 17:32:51.578722 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.578711 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="435b8490-b210-434f-b30e-2543e0137e4f" containerName="s3-init"
Apr 17 17:32:51.581130 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.581110 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.583796 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.583764 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 17 17:32:51.583796 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.583794 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 17 17:32:51.583994 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.583773 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-fbgtt\""
Apr 17 17:32:51.583994 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.583778 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 17:32:51.585074 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.585058 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 17:32:51.592178 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.592156 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"]
Apr 17 17:32:51.616497 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.616459 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.616670 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.616529 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f19b3998-85b3-40b4-89f4-c66414780c36-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.616670 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.616559 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b3998-85b3-40b4-89f4-c66414780c36-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.616670 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.616663 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgwzm\" (UniqueName: \"kubernetes.io/projected/f19b3998-85b3-40b4-89f4-c66414780c36-kube-api-access-lgwzm\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.717366 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.717325 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f19b3998-85b3-40b4-89f4-c66414780c36-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.717366 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.717373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b3998-85b3-40b4-89f4-c66414780c36-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.717658 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.717409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgwzm\" (UniqueName: \"kubernetes.io/projected/f19b3998-85b3-40b4-89f4-c66414780c36-kube-api-access-lgwzm\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.717658 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.717451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.717805 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:32:51.717685 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-predictor-serving-cert: secret "isvc-xgboost-graph-predictor-serving-cert" not found
Apr 17 17:32:51.717805 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:32:51.717789 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls podName:f19b3998-85b3-40b4-89f4-c66414780c36 nodeName:}" failed. No retries permitted until 2026-04-17 17:32:52.217766099 +0000 UTC m=+538.189049422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls") pod "isvc-xgboost-graph-predictor-669d8d6456-jpmw2" (UID: "f19b3998-85b3-40b4-89f4-c66414780c36") : secret "isvc-xgboost-graph-predictor-serving-cert" not found
Apr 17 17:32:51.717923 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.717898 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b3998-85b3-40b4-89f4-c66414780c36-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.718141 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.718118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f19b3998-85b3-40b4-89f4-c66414780c36-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.726694 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.726666 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgwzm\" (UniqueName: \"kubernetes.io/projected/f19b3998-85b3-40b4-89f4-c66414780c36-kube-api-access-lgwzm\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:51.752537 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.752499 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"]
Apr 17 17:32:51.755084 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.755063 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.757750 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.757724 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-235f3-predictor-serving-cert\""
Apr 17 17:32:51.757869 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.757749 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-235f3-kube-rbac-proxy-sar-config\""
Apr 17 17:32:51.766029 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.766007 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"]
Apr 17 17:32:51.818317 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.818279 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbffdfe-842e-4cb3-850f-d16083151446-error-404-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.818501 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.818346 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.818501 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.818423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwj8x\" (UniqueName: \"kubernetes.io/projected/0fbffdfe-842e-4cb3-850f-d16083151446-kube-api-access-qwj8x\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.919606 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.919490 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.919606 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.919551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwj8x\" (UniqueName: \"kubernetes.io/projected/0fbffdfe-842e-4cb3-850f-d16083151446-kube-api-access-qwj8x\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.919849 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:32:51.919638 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-235f3-predictor-serving-cert: secret "error-404-isvc-235f3-predictor-serving-cert" not found
Apr 17 17:32:51.919849 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.919662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbffdfe-842e-4cb3-850f-d16083151446-error-404-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.919849 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:32:51.919714 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls podName:0fbffdfe-842e-4cb3-850f-d16083151446 nodeName:}" failed. No retries permitted until 2026-04-17 17:32:52.419689684 +0000 UTC m=+538.390973006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls") pod "error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" (UID: "0fbffdfe-842e-4cb3-850f-d16083151446") : secret "error-404-isvc-235f3-predictor-serving-cert" not found
Apr 17 17:32:51.920262 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.920243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbffdfe-842e-4cb3-850f-d16083151446-error-404-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:51.930238 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:51.930208 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwj8x\" (UniqueName: \"kubernetes.io/projected/0fbffdfe-842e-4cb3-850f-d16083151446-kube-api-access-qwj8x\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
Apr 17 17:32:52.222200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.222102 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:52.224703 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.224677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jpmw2\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:32:52.424170 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.424132 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"
(UniqueName: \"kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls\") pod \"error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:32:52.469496 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.469461 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"] Apr 17 17:32:52.472167 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.472149 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.474589 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.474534 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 17 17:32:52.474762 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.474746 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 17 17:32:52.481788 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.481766 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"] Apr 17 17:32:52.492099 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.492075 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" Apr 17 17:32:52.525405 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.525370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2da75f-8df0-44f2-8533-584f97edbb63-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.525405 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.525409 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf2da75f-8df0-44f2-8533-584f97edbb63-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.525685 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.525442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf2da75f-8df0-44f2-8533-584f97edbb63-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.525685 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.525535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6vz\" (UniqueName: \"kubernetes.io/projected/bf2da75f-8df0-44f2-8533-584f97edbb63-kube-api-access-4z6vz\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.611985 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.611960 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"] Apr 17 17:32:52.614307 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:32:52.614277 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf19b3998_85b3_40b4_89f4_c66414780c36.slice/crio-ab2cbc03d8ddbcd9aaddbc15dd20e3d2527e3019756d309e1fdbe3edd21209c5 WatchSource:0}: Error finding container ab2cbc03d8ddbcd9aaddbc15dd20e3d2527e3019756d309e1fdbe3edd21209c5: Status 404 returned error can't find the container with id ab2cbc03d8ddbcd9aaddbc15dd20e3d2527e3019756d309e1fdbe3edd21209c5 Apr 17 17:32:52.626168 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.626144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2da75f-8df0-44f2-8533-584f97edbb63-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.626330 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.626183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf2da75f-8df0-44f2-8533-584f97edbb63-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.626330 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.626226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf2da75f-8df0-44f2-8533-584f97edbb63-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.626330 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.626271 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6vz\" (UniqueName: \"kubernetes.io/projected/bf2da75f-8df0-44f2-8533-584f97edbb63-kube-api-access-4z6vz\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.626643 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.626623 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf2da75f-8df0-44f2-8533-584f97edbb63-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.626893 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.626874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf2da75f-8df0-44f2-8533-584f97edbb63-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.628519 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.628501 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2da75f-8df0-44f2-8533-584f97edbb63-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.633992 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.633975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6vz\" (UniqueName: \"kubernetes.io/projected/bf2da75f-8df0-44f2-8533-584f97edbb63-kube-api-access-4z6vz\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.667419 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.667385 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:32:52.783812 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.783778 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:32:52.792460 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.792435 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"] Apr 17 17:32:52.794264 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:32:52.794235 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fbffdfe_842e_4cb3_850f_d16083151446.slice/crio-c7584224ec4271aeb52fc2b85913f21ef0aa19003465d02cdf3bbb73c6f69360 WatchSource:0}: Error finding container c7584224ec4271aeb52fc2b85913f21ef0aa19003465d02cdf3bbb73c6f69360: Status 404 returned error can't find the container with id c7584224ec4271aeb52fc2b85913f21ef0aa19003465d02cdf3bbb73c6f69360 Apr 17 17:32:52.905917 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:52.905893 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"] Apr 17 17:32:52.908594 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:32:52.908546 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2da75f_8df0_44f2_8533_584f97edbb63.slice/crio-78f256df82feef9c866b7eb98ef2332997ac09724afdcdd1053f1a87b8e2bbc3 WatchSource:0}: Error finding container 78f256df82feef9c866b7eb98ef2332997ac09724afdcdd1053f1a87b8e2bbc3: Status 404 returned error can't find the container with id 78f256df82feef9c866b7eb98ef2332997ac09724afdcdd1053f1a87b8e2bbc3 Apr 17 17:32:53.311605 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:53.311513 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" event={"ID":"0fbffdfe-842e-4cb3-850f-d16083151446","Type":"ContainerStarted","Data":"c7584224ec4271aeb52fc2b85913f21ef0aa19003465d02cdf3bbb73c6f69360"} Apr 17 17:32:53.313238 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:53.313179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerStarted","Data":"ab2cbc03d8ddbcd9aaddbc15dd20e3d2527e3019756d309e1fdbe3edd21209c5"} Apr 17 17:32:53.314913 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:53.314884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerStarted","Data":"78f256df82feef9c866b7eb98ef2332997ac09724afdcdd1053f1a87b8e2bbc3"} Apr 17 17:32:58.339269 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:58.339221 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerStarted","Data":"45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62"} Apr 17 17:32:58.341302 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:32:58.341268 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerStarted","Data":"f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27"} Apr 17 17:33:04.366007 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:04.365963 2580 generic.go:358] "Generic (PLEG): container finished" podID="f19b3998-85b3-40b4-89f4-c66414780c36" containerID="f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27" exitCode=0 Apr 17 17:33:04.366433 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:04.366040 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerDied","Data":"f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27"} Apr 17 17:33:04.367703 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:04.367676 2580 generic.go:358] "Generic (PLEG): container finished" podID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerID="45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62" exitCode=0 Apr 17 17:33:04.367830 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:04.367755 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerDied","Data":"45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62"} Apr 17 17:33:05.375406 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:05.375313 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" event={"ID":"0fbffdfe-842e-4cb3-850f-d16083151446","Type":"ContainerStarted","Data":"b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d"} Apr 17 17:33:13.412520 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.412484 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerStarted","Data":"859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c"} Apr 17 17:33:13.413055 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.412529 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" 
event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerStarted","Data":"7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556"} Apr 17 17:33:13.413055 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.412896 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:33:13.414205 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.414173 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" event={"ID":"0fbffdfe-842e-4cb3-850f-d16083151446","Type":"ContainerStarted","Data":"121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2"} Apr 17 17:33:13.414457 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.414419 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:33:13.414457 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.414443 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:33:13.415819 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.415767 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:33:13.435466 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.435409 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podStartSLOduration=1.908564358 podStartE2EDuration="21.435394233s" podCreationTimestamp="2026-04-17 17:32:52 +0000 UTC" firstStartedPulling="2026-04-17 17:32:52.910352696 +0000 UTC m=+538.881636016" lastFinishedPulling="2026-04-17 17:33:12.437182572 +0000 UTC m=+558.408465891" observedRunningTime="2026-04-17 17:33:13.432634876 +0000 UTC m=+559.403918216" watchObservedRunningTime="2026-04-17 17:33:13.435394233 +0000 UTC m=+559.406677573" Apr 17 17:33:13.454167 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:13.454100 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podStartSLOduration=2.813995163 podStartE2EDuration="22.454081583s" podCreationTimestamp="2026-04-17 17:32:51 +0000 UTC" firstStartedPulling="2026-04-17 17:32:52.796108928 +0000 UTC m=+538.767392246" lastFinishedPulling="2026-04-17 17:33:12.436195347 +0000 UTC m=+558.407478666" observedRunningTime="2026-04-17 17:33:13.453621142 +0000 UTC m=+559.424904479" watchObservedRunningTime="2026-04-17 17:33:13.454081583 +0000 UTC m=+559.425364926" Apr 17 17:33:14.417597 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:14.417554 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:33:14.417597 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:14.417562 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:33:14.418623 
ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:14.418575 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:33:15.421891 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:15.421851 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:33:19.422554 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:19.422518 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:33:19.423204 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:19.423128 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:33:20.427336 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:20.427300 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:33:20.428061 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:20.428031 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:33:27.471905 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:27.471870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerStarted","Data":"45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179"} Apr 17 17:33:27.471905 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:27.471911 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerStarted","Data":"77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43"} Apr 17 17:33:27.472364 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:27.472201 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" Apr 17 17:33:27.472364 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:27.472341 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" Apr 17 17:33:27.473414 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:27.473383 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:33:27.491755 ip-10-0-130-19 kubenswrapper[2580]: I0417 
17:33:27.491712 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podStartSLOduration=2.171388001 podStartE2EDuration="36.491700044s" podCreationTimestamp="2026-04-17 17:32:51 +0000 UTC" firstStartedPulling="2026-04-17 17:32:52.616114689 +0000 UTC m=+538.587398008" lastFinishedPulling="2026-04-17 17:33:26.936426726 +0000 UTC m=+572.907710051" observedRunningTime="2026-04-17 17:33:27.490617294 +0000 UTC m=+573.461900640" watchObservedRunningTime="2026-04-17 17:33:27.491700044 +0000 UTC m=+573.462983385" Apr 17 17:33:28.475987 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:28.475936 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:33:29.423283 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:29.423240 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:33:30.428798 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:30.428754 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:33:33.480093 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:33.480058 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" Apr 17 17:33:33.480573 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:33.480543 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:33:39.423830 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:39.423791 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:33:40.428771 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:40.428733 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:33:43.481007 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:43.480961 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:33:49.423983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:49.423944 2580 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 17:33:50.428488 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:50.428450 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:33:53.480572 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:53.480530 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:33:54.556912 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:54.556887 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:33:54.557515 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:54.557488 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:33:59.423716 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:33:59.423687 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:34:00.428388 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:00.428348 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:34:03.480905 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:03.480852 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:34:10.428076 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:10.428038 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:34:13.481221 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:13.481140 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:34:20.428331 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:20.428292 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 17 17:34:21.616682 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.616649 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"] Apr 17 17:34:21.617049 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.617028 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" containerID="cri-o://b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d" gracePeriod=30 Apr 17 17:34:21.617107 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.617064 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kube-rbac-proxy" containerID="cri-o://121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2" gracePeriod=30 Apr 17 17:34:21.748846 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.748810 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"] Apr 17 17:34:21.752248 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.752231 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.754738 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.754709 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d987b-predictor-serving-cert\"" Apr 17 17:34:21.754738 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.754733 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d987b-kube-rbac-proxy-sar-config\"" Apr 17 17:34:21.762918 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.762887 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"] Apr 17 17:34:21.829887 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.829857 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3291e5f9-dbb1-4380-81de-6f318659f1c2-proxy-tls\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.830068 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.829894 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3291e5f9-dbb1-4380-81de-6f318659f1c2-error-404-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.830068 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.829942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9cwn\" (UniqueName: \"kubernetes.io/projected/3291e5f9-dbb1-4380-81de-6f318659f1c2-kube-api-access-g9cwn\") pod 
\"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.930700 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.930605 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3291e5f9-dbb1-4380-81de-6f318659f1c2-proxy-tls\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.930700 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.930641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3291e5f9-dbb1-4380-81de-6f318659f1c2-error-404-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.930700 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.930662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9cwn\" (UniqueName: \"kubernetes.io/projected/3291e5f9-dbb1-4380-81de-6f318659f1c2-kube-api-access-g9cwn\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.931221 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.931198 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3291e5f9-dbb1-4380-81de-6f318659f1c2-error-404-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.933346 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.933319 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3291e5f9-dbb1-4380-81de-6f318659f1c2-proxy-tls\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:21.938308 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:21.938280 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9cwn\" (UniqueName: \"kubernetes.io/projected/3291e5f9-dbb1-4380-81de-6f318659f1c2-kube-api-access-g9cwn\") pod \"error-404-isvc-d987b-predictor-5666cb86b9-jz6hx\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") " pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:22.063358 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.063318 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:22.196018 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.195991 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"] Apr 17 17:34:22.198220 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:34:22.198193 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3291e5f9_dbb1_4380_81de_6f318659f1c2.slice/crio-7466b94d8028c46e48e408013844c15ea218c5b0732150d1e4d6f4e7bf070306 WatchSource:0}: Error finding container 7466b94d8028c46e48e408013844c15ea218c5b0732150d1e4d6f4e7bf070306: Status 404 returned error can't find the container with id 7466b94d8028c46e48e408013844c15ea218c5b0732150d1e4d6f4e7bf070306 Apr 17 17:34:22.654429 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.654396 2580 generic.go:358] "Generic (PLEG): container finished" podID="0fbffdfe-842e-4cb3-850f-d16083151446" containerID="121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2" exitCode=2 Apr 17 17:34:22.654875 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.654468 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" event={"ID":"0fbffdfe-842e-4cb3-850f-d16083151446","Type":"ContainerDied","Data":"121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2"} Apr 17 17:34:22.656061 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.656035 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" event={"ID":"3291e5f9-dbb1-4380-81de-6f318659f1c2","Type":"ContainerStarted","Data":"db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc"} Apr 17 17:34:22.656200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.656067 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" event={"ID":"3291e5f9-dbb1-4380-81de-6f318659f1c2","Type":"ContainerStarted","Data":"1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e"} Apr 17 17:34:22.656200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.656081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" event={"ID":"3291e5f9-dbb1-4380-81de-6f318659f1c2","Type":"ContainerStarted","Data":"7466b94d8028c46e48e408013844c15ea218c5b0732150d1e4d6f4e7bf070306"} Apr 17 17:34:22.656200 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.656145 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:22.673083 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:22.673041 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podStartSLOduration=1.673028446 podStartE2EDuration="1.673028446s" podCreationTimestamp="2026-04-17 17:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:34:22.671299405 +0000 UTC m=+628.642582746" watchObservedRunningTime="2026-04-17 17:34:22.673028446 +0000 UTC m=+628.644311786" Apr 17 17:34:23.481356 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:23.481315 2580 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 17 17:34:23.660034 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:23.660005 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:23.661272 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:23.661237 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:34:24.418396 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.418354 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 17 17:34:24.663987 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.663942 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:34:24.871895 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.871873 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:34:24.957900 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.957815 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls\") pod \"0fbffdfe-842e-4cb3-850f-d16083151446\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " Apr 17 17:34:24.957900 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.957892 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwj8x\" (UniqueName: \"kubernetes.io/projected/0fbffdfe-842e-4cb3-850f-d16083151446-kube-api-access-qwj8x\") pod \"0fbffdfe-842e-4cb3-850f-d16083151446\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " Apr 17 17:34:24.958088 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.957938 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbffdfe-842e-4cb3-850f-d16083151446-error-404-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"0fbffdfe-842e-4cb3-850f-d16083151446\" (UID: \"0fbffdfe-842e-4cb3-850f-d16083151446\") " Apr 17 17:34:24.958361 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.958324 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbffdfe-842e-4cb3-850f-d16083151446-error-404-isvc-235f3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-235f3-kube-rbac-proxy-sar-config") pod "0fbffdfe-842e-4cb3-850f-d16083151446" (UID: "0fbffdfe-842e-4cb3-850f-d16083151446"). 
InnerVolumeSpecName "error-404-isvc-235f3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:24.960156 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.960131 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0fbffdfe-842e-4cb3-850f-d16083151446" (UID: "0fbffdfe-842e-4cb3-850f-d16083151446"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:24.960249 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:24.960158 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbffdfe-842e-4cb3-850f-d16083151446-kube-api-access-qwj8x" (OuterVolumeSpecName: "kube-api-access-qwj8x") pod "0fbffdfe-842e-4cb3-850f-d16083151446" (UID: "0fbffdfe-842e-4cb3-850f-d16083151446"). InnerVolumeSpecName "kube-api-access-qwj8x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:25.058850 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.058792 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwj8x\" (UniqueName: \"kubernetes.io/projected/0fbffdfe-842e-4cb3-850f-d16083151446-kube-api-access-qwj8x\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:34:25.058850 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.058845 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbffdfe-842e-4cb3-850f-d16083151446-error-404-isvc-235f3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:34:25.058850 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.058857 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbffdfe-842e-4cb3-850f-d16083151446-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:34:25.669701 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.669662 2580 generic.go:358] "Generic (PLEG): container finished" podID="0fbffdfe-842e-4cb3-850f-d16083151446" containerID="b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d" exitCode=0 Apr 17 17:34:25.670194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.669750 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" Apr 17 17:34:25.670194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.669748 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" event={"ID":"0fbffdfe-842e-4cb3-850f-d16083151446","Type":"ContainerDied","Data":"b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d"} Apr 17 17:34:25.670194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.669794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb" event={"ID":"0fbffdfe-842e-4cb3-850f-d16083151446","Type":"ContainerDied","Data":"c7584224ec4271aeb52fc2b85913f21ef0aa19003465d02cdf3bbb73c6f69360"} Apr 17 17:34:25.670194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.669818 2580 scope.go:117] "RemoveContainer" containerID="121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2" Apr 17 17:34:25.678463 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.678444 2580 scope.go:117] "RemoveContainer" containerID="b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d" Apr 17 17:34:25.685685 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.685664 2580 scope.go:117] "RemoveContainer" containerID="121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2" Apr 17 17:34:25.685931 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:34:25.685913 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2\": container with ID starting with 121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2 not found: ID does not exist" containerID="121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2" Apr 17 17:34:25.685975 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.685940 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2"} err="failed to get container status \"121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2\": rpc error: code = NotFound desc = could not find container \"121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2\": container with ID starting with 121b4505c08d5c5046d2b2e14e7810d519afb313f7919b52f0a864f6e752fae2 not found: ID does not exist" Apr 17 17:34:25.685975 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.685958 2580 scope.go:117] "RemoveContainer" containerID="b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d" Apr 17 17:34:25.686441 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:34:25.686383 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d\": container with ID starting with b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d not found: ID does not exist" containerID="b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d" Apr 17 17:34:25.686441 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.686413 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d"} err="failed to get container status \"b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d\": rpc error: code = 
NotFound desc = could not find container \"b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d\": container with ID starting with b6c5ed9767ebd5360fc2725e9ff33333f6d93f951589da11900f06bfb6d4be2d not found: ID does not exist" Apr 17 17:34:25.692060 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.692025 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"] Apr 17 17:34:25.693588 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:25.693564 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-235f3-predictor-6d9f8fbcff-zftlb"] Apr 17 17:34:26.641838 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:26.641802 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" path="/var/lib/kubelet/pods/0fbffdfe-842e-4cb3-850f-d16083151446/volumes" Apr 17 17:34:29.668631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:29.668602 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:34:29.669005 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:29.668952 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:34:30.429305 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:30.429277 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" Apr 17 17:34:33.481749 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:33.481719 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" Apr 17 17:34:39.669824 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:39.669777 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:34:49.669733 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:49.669689 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:34:59.670025 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:34:59.669979 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 17 17:35:09.670424 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:09.670392 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" Apr 17 17:35:11.522144 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.522105 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"] Apr 17 
17:35:11.522565 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.522442 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" containerID="cri-o://77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43" gracePeriod=30 Apr 17 17:35:11.522565 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.522494 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kube-rbac-proxy" containerID="cri-o://45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179" gracePeriod=30 Apr 17 17:35:11.585265 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585232 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"] Apr 17 17:35:11.585823 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585808 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kube-rbac-proxy" Apr 17 17:35:11.585883 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585825 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kube-rbac-proxy" Apr 17 17:35:11.585883 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585854 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" Apr 17 17:35:11.585883 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585864 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" Apr 17 17:35:11.585981 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585944 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kube-rbac-proxy" Apr 17 17:35:11.585981 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.585957 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fbffdfe-842e-4cb3-850f-d16083151446" containerName="kserve-container" Apr 17 17:35:11.590855 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.590829 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.593500 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.593478 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-fa93d-predictor-serving-cert\"" Apr 17 17:35:11.593643 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.593483 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-fa93d-kube-rbac-proxy-sar-config\"" Apr 17 17:35:11.596962 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.596928 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"] Apr 17 17:35:11.598027 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.597414 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container" containerID="cri-o://7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556" gracePeriod=30 Apr 17 17:35:11.598027 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.597435 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kube-rbac-proxy" containerID="cri-o://859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c" gracePeriod=30 Apr 17 17:35:11.600299 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.600238 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"] Apr 17 17:35:11.666841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.666808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53938468-83ca-43fd-8056-3e474d3956a9-error-404-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.667017 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.666919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.667017 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.666941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnfgt\" (UniqueName: \"kubernetes.io/projected/53938468-83ca-43fd-8056-3e474d3956a9-kube-api-access-dnfgt\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.768010 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.767973 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls\") 
pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.768197 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.768020 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnfgt\" (UniqueName: \"kubernetes.io/projected/53938468-83ca-43fd-8056-3e474d3956a9-kube-api-access-dnfgt\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.768197 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.768079 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53938468-83ca-43fd-8056-3e474d3956a9-error-404-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:35:11.768197 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:11.768144 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-serving-cert: secret "error-404-isvc-fa93d-predictor-serving-cert" not found Apr 17 17:35:11.768442 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:11.768224 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls podName:53938468-83ca-43fd-8056-3e474d3956a9 nodeName:}" failed. No retries permitted until 2026-04-17 17:35:12.268202731 +0000 UTC m=+678.239486053 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls") pod "error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" (UID: "53938468-83ca-43fd-8056-3e474d3956a9") : secret "error-404-isvc-fa93d-predictor-serving-cert" not found
Apr 17 17:35:11.768731 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.768708 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53938468-83ca-43fd-8056-3e474d3956a9-error-404-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:11.779188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.779121 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnfgt\" (UniqueName: \"kubernetes.io/projected/53938468-83ca-43fd-8056-3e474d3956a9-kube-api-access-dnfgt\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:11.831050 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.831016 2580 generic.go:358] "Generic (PLEG): container finished" podID="f19b3998-85b3-40b4-89f4-c66414780c36" containerID="45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179" exitCode=2
Apr 17 17:35:11.831224 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.831092 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerDied","Data":"45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179"}
Apr 17 17:35:11.832812 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.832787 2580 generic.go:358] "Generic (PLEG): container finished" podID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerID="859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c" exitCode=2
Apr 17 17:35:11.832931 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:11.832826 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerDied","Data":"859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c"}
Apr 17 17:35:12.273692 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.273658 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:12.276001 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.275973 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls\") pod \"error-404-isvc-fa93d-predictor-6ff7867c59-kgth7\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:12.504833 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.504795 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:12.628783 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.628753 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"]
Apr 17 17:35:12.632362 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:35:12.632328 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53938468_83ca_43fd_8056_3e474d3956a9.slice/crio-50bea3ee078afe25fb97080822a8a42c99a5ab8798c51bc17823f89f82cd261c WatchSource:0}: Error finding container 50bea3ee078afe25fb97080822a8a42c99a5ab8798c51bc17823f89f82cd261c: Status 404 returned error can't find the container with id 50bea3ee078afe25fb97080822a8a42c99a5ab8798c51bc17823f89f82cd261c
Apr 17 17:35:12.634166 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.634152 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:35:12.838167 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.838130 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" event={"ID":"53938468-83ca-43fd-8056-3e474d3956a9","Type":"ContainerStarted","Data":"bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f"}
Apr 17 17:35:12.838167 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.838164 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" event={"ID":"53938468-83ca-43fd-8056-3e474d3956a9","Type":"ContainerStarted","Data":"fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0"}
Apr 17 17:35:12.838422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.838178 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" event={"ID":"53938468-83ca-43fd-8056-3e474d3956a9","Type":"ContainerStarted","Data":"50bea3ee078afe25fb97080822a8a42c99a5ab8798c51bc17823f89f82cd261c"}
Apr 17 17:35:12.838422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.838272 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:12.859638 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:12.859489 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podStartSLOduration=1.859473075 podStartE2EDuration="1.859473075s" podCreationTimestamp="2026-04-17 17:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:12.858694418 +0000 UTC m=+678.829977759" watchObservedRunningTime="2026-04-17 17:35:12.859473075 +0000 UTC m=+678.830756415"
Apr 17 17:35:13.476443 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:13.476396 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused"
Apr 17 17:35:13.480785 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:13.480749 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 17 17:35:13.842482 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:13.842451 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:13.843848 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:13.843819 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 17:35:14.845289 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:14.845239 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 17:35:15.368544 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.368517 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:35:15.401220 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.401185 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls\") pod \"f19b3998-85b3-40b4-89f4-c66414780c36\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") "
Apr 17 17:35:15.401410 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.401240 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgwzm\" (UniqueName: \"kubernetes.io/projected/f19b3998-85b3-40b4-89f4-c66414780c36-kube-api-access-lgwzm\") pod \"f19b3998-85b3-40b4-89f4-c66414780c36\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") "
Apr 17 17:35:15.401410 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.401379 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f19b3998-85b3-40b4-89f4-c66414780c36-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"f19b3998-85b3-40b4-89f4-c66414780c36\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") "
Apr 17 17:35:15.401522 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.401441 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b3998-85b3-40b4-89f4-c66414780c36-kserve-provision-location\") pod \"f19b3998-85b3-40b4-89f4-c66414780c36\" (UID: \"f19b3998-85b3-40b4-89f4-c66414780c36\") "
Apr 17 17:35:15.402064 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.401933 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f19b3998-85b3-40b4-89f4-c66414780c36-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "f19b3998-85b3-40b4-89f4-c66414780c36" (UID: "f19b3998-85b3-40b4-89f4-c66414780c36"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:35:15.402268 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.402071 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f19b3998-85b3-40b4-89f4-c66414780c36-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f19b3998-85b3-40b4-89f4-c66414780c36" (UID: "f19b3998-85b3-40b4-89f4-c66414780c36"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:35:15.403674 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.403642 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f19b3998-85b3-40b4-89f4-c66414780c36" (UID: "f19b3998-85b3-40b4-89f4-c66414780c36"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:35:15.403972 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.403943 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19b3998-85b3-40b4-89f4-c66414780c36-kube-api-access-lgwzm" (OuterVolumeSpecName: "kube-api-access-lgwzm") pod "f19b3998-85b3-40b4-89f4-c66414780c36" (UID: "f19b3998-85b3-40b4-89f4-c66414780c36"). InnerVolumeSpecName "kube-api-access-lgwzm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:35:15.422429 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.422387 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused"
Apr 17 17:35:15.502863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.502772 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f19b3998-85b3-40b4-89f4-c66414780c36-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:15.502863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.502802 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgwzm\" (UniqueName: \"kubernetes.io/projected/f19b3998-85b3-40b4-89f4-c66414780c36-kube-api-access-lgwzm\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:15.502863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.502813 2580 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f19b3998-85b3-40b4-89f4-c66414780c36-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:15.502863 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.502823 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f19b3998-85b3-40b4-89f4-c66414780c36-kserve-provision-location\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:15.850654 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.850618 2580 generic.go:358] "Generic (PLEG): container finished" podID="f19b3998-85b3-40b4-89f4-c66414780c36" containerID="77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43" exitCode=0
Apr 17 17:35:15.851016 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.850689 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerDied","Data":"77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43"}
Apr 17 17:35:15.851016 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.850706 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"
Apr 17 17:35:15.851016 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.850724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2" event={"ID":"f19b3998-85b3-40b4-89f4-c66414780c36","Type":"ContainerDied","Data":"ab2cbc03d8ddbcd9aaddbc15dd20e3d2527e3019756d309e1fdbe3edd21209c5"}
Apr 17 17:35:15.851016 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.850740 2580 scope.go:117] "RemoveContainer" containerID="45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179"
Apr 17 17:35:15.860057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.860040 2580 scope.go:117] "RemoveContainer" containerID="77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43"
Apr 17 17:35:15.867688 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.867665 2580 scope.go:117] "RemoveContainer" containerID="f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27"
Apr 17 17:35:15.873631 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.873605 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"]
Apr 17 17:35:15.875026 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.875011 2580 scope.go:117] "RemoveContainer" containerID="45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179"
Apr 17 17:35:15.875281 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:15.875258 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179\": container with ID starting with 45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179 not found: ID does not exist" containerID="45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179"
Apr 17 17:35:15.875338 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.875289 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179"} err="failed to get container status \"45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179\": rpc error: code = NotFound desc = could not find container \"45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179\": container with ID starting with 45ace0317bf2d9d8c348fc4a9ad6845f5febd164684381c4ba70a9120164c179 not found: ID does not exist"
Apr 17 17:35:15.875338 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.875307 2580 scope.go:117] "RemoveContainer" containerID="77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43"
Apr 17 17:35:15.875621 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:15.875593 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43\": container with ID starting with 77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43 not found: ID does not exist" containerID="77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43"
Apr 17 17:35:15.875741 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.875629 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43"} err="failed to get container status \"77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43\": rpc error: code = NotFound desc = could not find container \"77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43\": container with ID starting with 77d4eb01f894ec30ca79cf5e6b9e289f98913d155e203ef06532c313cceaae43 not found: ID does not exist"
Apr 17 17:35:15.875741 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.875651 2580 scope.go:117] "RemoveContainer" containerID="f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27"
Apr 17 17:35:15.876089 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:15.876069 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27\": container with ID starting with f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27 not found: ID does not exist" containerID="f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27"
Apr 17 17:35:15.876168 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.876096 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27"} err="failed to get container status \"f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27\": rpc error: code = NotFound desc = could not find container \"f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27\": container with ID starting with f11c0e9a39e03b0c3c37f08897061d503481101e10ecdaaf63cf3840958f0d27 not found: ID does not exist"
Apr 17 17:35:15.877523 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:15.877505 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jpmw2"]
Apr 17 17:35:16.334906 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.334882 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"
Apr 17 17:35:16.411305 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.411222 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf2da75f-8df0-44f2-8533-584f97edbb63-kserve-provision-location\") pod \"bf2da75f-8df0-44f2-8533-584f97edbb63\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") "
Apr 17 17:35:16.411305 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.411267 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z6vz\" (UniqueName: \"kubernetes.io/projected/bf2da75f-8df0-44f2-8533-584f97edbb63-kube-api-access-4z6vz\") pod \"bf2da75f-8df0-44f2-8533-584f97edbb63\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") "
Apr 17 17:35:16.411305 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.411297 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2da75f-8df0-44f2-8533-584f97edbb63-proxy-tls\") pod \"bf2da75f-8df0-44f2-8533-584f97edbb63\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") "
Apr 17 17:35:16.411513 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.411331 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf2da75f-8df0-44f2-8533-584f97edbb63-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"bf2da75f-8df0-44f2-8533-584f97edbb63\" (UID: \"bf2da75f-8df0-44f2-8533-584f97edbb63\") "
Apr 17 17:35:16.411711 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.411685 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2da75f-8df0-44f2-8533-584f97edbb63-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bf2da75f-8df0-44f2-8533-584f97edbb63" (UID: "bf2da75f-8df0-44f2-8533-584f97edbb63"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:35:16.411785 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.411738 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf2da75f-8df0-44f2-8533-584f97edbb63-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "bf2da75f-8df0-44f2-8533-584f97edbb63" (UID: "bf2da75f-8df0-44f2-8533-584f97edbb63"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:35:16.413407 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.413378 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf2da75f-8df0-44f2-8533-584f97edbb63-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bf2da75f-8df0-44f2-8533-584f97edbb63" (UID: "bf2da75f-8df0-44f2-8533-584f97edbb63"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:35:16.413523 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.413409 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2da75f-8df0-44f2-8533-584f97edbb63-kube-api-access-4z6vz" (OuterVolumeSpecName: "kube-api-access-4z6vz") pod "bf2da75f-8df0-44f2-8533-584f97edbb63" (UID: "bf2da75f-8df0-44f2-8533-584f97edbb63"). InnerVolumeSpecName "kube-api-access-4z6vz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:35:16.512496 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.512459 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf2da75f-8df0-44f2-8533-584f97edbb63-kserve-provision-location\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:16.512496 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.512488 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4z6vz\" (UniqueName: \"kubernetes.io/projected/bf2da75f-8df0-44f2-8533-584f97edbb63-kube-api-access-4z6vz\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:16.512496 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.512499 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf2da75f-8df0-44f2-8533-584f97edbb63-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:16.512756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.512512 2580 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf2da75f-8df0-44f2-8533-584f97edbb63-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:35:16.641407 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.641371 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" path="/var/lib/kubelet/pods/f19b3998-85b3-40b4-89f4-c66414780c36/volumes"
Apr 17 17:35:16.858354 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.858312 2580 generic.go:358] "Generic (PLEG): container finished" podID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerID="7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556" exitCode=0
Apr 17 17:35:16.858829 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.858499 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerDied","Data":"7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556"}
Apr 17 17:35:16.858829 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.858542 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq" event={"ID":"bf2da75f-8df0-44f2-8533-584f97edbb63","Type":"ContainerDied","Data":"78f256df82feef9c866b7eb98ef2332997ac09724afdcdd1053f1a87b8e2bbc3"}
Apr 17 17:35:16.858829 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.858563 2580 scope.go:117] "RemoveContainer" containerID="859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c"
Apr 17 17:35:16.858829 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.858825 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"
Apr 17 17:35:16.868266 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.868247 2580 scope.go:117] "RemoveContainer" containerID="7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556"
Apr 17 17:35:16.875800 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.875777 2580 scope.go:117] "RemoveContainer" containerID="45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62"
Apr 17 17:35:16.877414 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.877393 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"]
Apr 17 17:35:16.881047 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.881027 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-x2rsq"]
Apr 17 17:35:16.884100 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.884077 2580 scope.go:117] "RemoveContainer" containerID="859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c"
Apr 17 17:35:16.884404 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:16.884383 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c\": container with ID starting with 859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c not found: ID does not exist" containerID="859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c"
Apr 17 17:35:16.884467 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.884415 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c"} err="failed to get container status \"859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c\": rpc error: code = NotFound desc = could not find container \"859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c\": container with ID starting with 859bc1cdb4bc91b1f773beaf8e9bf0aa9f7a6844125ea158319dbd7657d1719c not found: ID does not exist"
Apr 17 17:35:16.884467 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.884436 2580 scope.go:117] "RemoveContainer" containerID="7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556"
Apr 17 17:35:16.884698 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:16.884679 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556\": container with ID starting with 7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556 not found: ID does not exist" containerID="7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556"
Apr 17 17:35:16.884747 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.884704 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556"} err="failed to get container status \"7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556\": rpc error: code = NotFound desc = could not find container \"7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556\": container with ID starting with 7b7d482f1452b70ff137b39def85995abd64c67b68094463fde13bfea206f556 not found: ID does not exist"
Apr 17 17:35:16.884747 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.884722 2580 scope.go:117] "RemoveContainer" containerID="45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62"
Apr 17 17:35:16.884951 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:35:16.884936 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62\": container with ID starting with 45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62 not found: ID does not exist" containerID="45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62"
Apr 17 17:35:16.884986 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:16.884955 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62"} err="failed to get container status \"45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62\": rpc error: code = NotFound desc = could not find container \"45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62\": container with ID starting with 45ac0eec520d953ed5f7bc21016f0373815d08078d1e59f2a89824c86fa60b62 not found: ID does not exist"
Apr 17 17:35:18.641133 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:18.641097 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" path="/var/lib/kubelet/pods/bf2da75f-8df0-44f2-8533-584f97edbb63/volumes"
Apr 17 17:35:19.850250 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:19.850222 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:35:19.850768 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:19.850744 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 17:35:29.850804 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:29.850760 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 17:35:39.851048 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:39.851005 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 17:35:49.851104 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:49.851021 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 17 17:35:59.851720 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:35:59.851690 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"
Apr 17 17:38:54.584025 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:38:54.583989 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log"
Apr 17 17:38:54.589139 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:38:54.589120 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log"
Apr 17 17:43:36.680376 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.680290 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"]
Apr 17 17:43:36.680991 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.680666 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" containerID="cri-o://1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e" gracePeriod=30
Apr 17 17:43:36.680991 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.680714 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kube-rbac-proxy" containerID="cri-o://db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc" gracePeriod=30
Apr 17 17:43:36.759997 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.759964 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"]
Apr 17 17:43:36.760430 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760412 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container"
Apr 17 17:43:36.760430 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760431 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760449 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="storage-initializer"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760457 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="storage-initializer"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760473 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="storage-initializer"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760483 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="storage-initializer"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760492 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kube-rbac-proxy"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760500 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kube-rbac-proxy"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760512 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kube-rbac-proxy"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760520 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kube-rbac-proxy"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760529 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container"
Apr 17 17:43:36.760610 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760537 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container"
Apr 17 17:43:36.761079 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760646 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kserve-container"
Apr 17 17:43:36.761079 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760660 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kserve-container"
Apr 17 17:43:36.761079 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760674 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f19b3998-85b3-40b4-89f4-c66414780c36" containerName="kube-rbac-proxy"
Apr 17 17:43:36.761079 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.760688 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf2da75f-8df0-44f2-8533-584f97edbb63" containerName="kube-rbac-proxy"
Apr 17 17:43:36.763832 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.763811 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.766287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.766264 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d1a8d-predictor-serving-cert\""
Apr 17 17:43:36.766401 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.766297 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\""
Apr 17 17:43:36.773954 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.773932 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"]
Apr 17 17:43:36.837233 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.837183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vmrv\" (UniqueName: \"kubernetes.io/projected/82747371-50d2-4c12-8f87-ecc7784c2781-kube-api-access-9vmrv\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.837421 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.837355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.837421 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.837411 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82747371-50d2-4c12-8f87-ecc7784c2781-error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.938803 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.938713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vmrv\" (UniqueName: \"kubernetes.io/projected/82747371-50d2-4c12-8f87-ecc7784c2781-kube-api-access-9vmrv\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.938803 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.938796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.939038 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.938835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82747371-50d2-4c12-8f87-ecc7784c2781-error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.939038 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:43:36.938961 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-serving-cert: secret "error-404-isvc-d1a8d-predictor-serving-cert" not found
Apr 17 17:43:36.939145 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:43:36.939044 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls podName:82747371-50d2-4c12-8f87-ecc7784c2781 nodeName:}" failed. No retries permitted until 2026-04-17 17:43:37.439021553 +0000 UTC m=+1183.410304890 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls") pod "error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" (UID: "82747371-50d2-4c12-8f87-ecc7784c2781") : secret "error-404-isvc-d1a8d-predictor-serving-cert" not found
Apr 17 17:43:36.939445 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.939427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82747371-50d2-4c12-8f87-ecc7784c2781-error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:36.947958 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:36.947928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vmrv\" (UniqueName: \"kubernetes.io/projected/82747371-50d2-4c12-8f87-ecc7784c2781-kube-api-access-9vmrv\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:37.444230 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.444195 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:37.446665 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.446633 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls\") pod \"error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") " pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:37.550681 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.550641 2580 generic.go:358] "Generic (PLEG): container finished" podID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerID="db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc" exitCode=2
Apr 17 17:43:37.550841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.550698 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" event={"ID":"3291e5f9-dbb1-4380-81de-6f318659f1c2","Type":"ContainerDied","Data":"db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc"}
Apr 17 17:43:37.676391 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.676352 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:37.803213 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.803178 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"]
Apr 17 17:43:37.806839 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:43:37.806806 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82747371_50d2_4c12_8f87_ecc7784c2781.slice/crio-8dda1043da0aa2a29976eaed4da569c66fe5974269651dc66f79637e425c5718 WatchSource:0}: Error finding container 8dda1043da0aa2a29976eaed4da569c66fe5974269651dc66f79637e425c5718: Status 404 returned error can't find the container with id 8dda1043da0aa2a29976eaed4da569c66fe5974269651dc66f79637e425c5718
Apr 17 17:43:37.809020 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:37.808998 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:43:38.556024 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.555983 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" event={"ID":"82747371-50d2-4c12-8f87-ecc7784c2781","Type":"ContainerStarted","Data":"e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b"}
Apr 17 17:43:38.556024 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.556025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" event={"ID":"82747371-50d2-4c12-8f87-ecc7784c2781","Type":"ContainerStarted","Data":"c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e"}
Apr 17 17:43:38.556276 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.556039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" event={"ID":"82747371-50d2-4c12-8f87-ecc7784c2781","Type":"ContainerStarted","Data":"8dda1043da0aa2a29976eaed4da569c66fe5974269651dc66f79637e425c5718"}
Apr 17 17:43:38.556276 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.556166 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:38.556378 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.556289 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:38.557375 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.557352 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 17 17:43:38.575877 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:38.575829 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podStartSLOduration=2.575814544 podStartE2EDuration="2.575814544s" podCreationTimestamp="2026-04-17 17:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:43:38.5737025 +0000 UTC m=+1184.544985843" watchObservedRunningTime="2026-04-17 17:43:38.575814544 +0000 UTC m=+1184.547097962"
Apr 17 17:43:39.559916 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.559879 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 17 17:43:39.664332 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.664284 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused"
Apr 17 17:43:39.669672 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.669631 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 17 17:43:39.920790 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.920478 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"
Apr 17 17:43:39.966313 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.966284 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3291e5f9-dbb1-4380-81de-6f318659f1c2-error-404-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"3291e5f9-dbb1-4380-81de-6f318659f1c2\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") "
Apr 17 17:43:39.966461 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.966338 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3291e5f9-dbb1-4380-81de-6f318659f1c2-proxy-tls\") pod \"3291e5f9-dbb1-4380-81de-6f318659f1c2\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") "
Apr 17 17:43:39.966461 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.966372 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9cwn\" (UniqueName: \"kubernetes.io/projected/3291e5f9-dbb1-4380-81de-6f318659f1c2-kube-api-access-g9cwn\") pod \"3291e5f9-dbb1-4380-81de-6f318659f1c2\" (UID: \"3291e5f9-dbb1-4380-81de-6f318659f1c2\") "
Apr 17 17:43:39.966665 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.966642 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3291e5f9-dbb1-4380-81de-6f318659f1c2-error-404-isvc-d987b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d987b-kube-rbac-proxy-sar-config") pod "3291e5f9-dbb1-4380-81de-6f318659f1c2" (UID: "3291e5f9-dbb1-4380-81de-6f318659f1c2"). InnerVolumeSpecName "error-404-isvc-d987b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:43:39.968397 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.968370 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3291e5f9-dbb1-4380-81de-6f318659f1c2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3291e5f9-dbb1-4380-81de-6f318659f1c2" (UID: "3291e5f9-dbb1-4380-81de-6f318659f1c2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:43:39.968516 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:39.968438 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3291e5f9-dbb1-4380-81de-6f318659f1c2-kube-api-access-g9cwn" (OuterVolumeSpecName: "kube-api-access-g9cwn") pod "3291e5f9-dbb1-4380-81de-6f318659f1c2" (UID: "3291e5f9-dbb1-4380-81de-6f318659f1c2"). InnerVolumeSpecName "kube-api-access-g9cwn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:43:40.067468 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.067438 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3291e5f9-dbb1-4380-81de-6f318659f1c2-error-404-isvc-d987b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:43:40.067468 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.067467 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3291e5f9-dbb1-4380-81de-6f318659f1c2-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:43:40.067699 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.067480 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9cwn\" (UniqueName: \"kubernetes.io/projected/3291e5f9-dbb1-4380-81de-6f318659f1c2-kube-api-access-g9cwn\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:43:40.565083 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.565050 2580 generic.go:358] "Generic (PLEG): container finished" podID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerID="1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e" exitCode=0
Apr 17 17:43:40.565490 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.565120 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"
Apr 17 17:43:40.565490 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.565131 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" event={"ID":"3291e5f9-dbb1-4380-81de-6f318659f1c2","Type":"ContainerDied","Data":"1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e"}
Apr 17 17:43:40.565490 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.565170 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx" event={"ID":"3291e5f9-dbb1-4380-81de-6f318659f1c2","Type":"ContainerDied","Data":"7466b94d8028c46e48e408013844c15ea218c5b0732150d1e4d6f4e7bf070306"}
Apr 17 17:43:40.565490 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.565186 2580 scope.go:117] "RemoveContainer" containerID="db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc"
Apr 17 17:43:40.573544 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.573522 2580 scope.go:117] "RemoveContainer" containerID="1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e"
Apr 17 17:43:40.580791 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.580774 2580 scope.go:117] "RemoveContainer" containerID="db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc"
Apr 17 17:43:40.581073 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:43:40.581055 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc\": container with ID starting with db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc not found: ID does not exist" containerID="db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc"
Apr 17 17:43:40.581127 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.581082 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc"} err="failed to get container status \"db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc\": rpc error: code = NotFound desc = could not find container \"db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc\": container with ID starting with db08403b3025565f093a133651c07f922e748e3dd319dea4b67e3827e99879bc not found: ID does not exist"
Apr 17 17:43:40.581127 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.581103 2580 scope.go:117] "RemoveContainer" containerID="1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e"
Apr 17 17:43:40.581339 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:43:40.581321 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e\": container with ID starting with 1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e not found: ID does not exist" containerID="1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e"
Apr 17 17:43:40.581381 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.581345 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e"} err="failed to get container status \"1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e\": rpc error: code = NotFound desc = could not find container \"1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e\": container with ID starting with 1d1bc79f3eef60913842e9ee278382035b130655d69688bc17f29125a36e637e not found: ID does not exist"
Apr 17 17:43:40.590014 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.589990 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"]
Apr 17 17:43:40.593771 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.593748 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d987b-predictor-5666cb86b9-jz6hx"]
Apr 17 17:43:40.640944 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:40.640915 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" path="/var/lib/kubelet/pods/3291e5f9-dbb1-4380-81de-6f318659f1c2/volumes"
Apr 17 17:43:44.564519 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:44.564489 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:43:44.564993 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:44.564969 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 17 17:43:54.564983 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:54.564934 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 17 17:43:54.609977 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:54.609949 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log"
Apr 17 17:43:54.617475 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:43:54.617450 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log"
Apr 17 17:44:04.564945 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:04.564897 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 17 17:44:14.565000 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:14.564961 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 17 17:44:24.565725 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:24.565696 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:44:26.393388 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.393357 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"]
Apr 17 17:44:26.393824 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.393650 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container" containerID="cri-o://fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0" gracePeriod=30
Apr 17 17:44:26.393824 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.393702 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kube-rbac-proxy" containerID="cri-o://bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f" gracePeriod=30
Apr 17 17:44:26.460852 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.460819 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"]
Apr 17 17:44:26.463611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.461934 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container"
Apr 17 17:44:26.463611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.461967 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container"
Apr 17 17:44:26.463611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.462027 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kube-rbac-proxy"
Apr 17 17:44:26.463611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.462036 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kube-rbac-proxy"
Apr 17 17:44:26.463611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.462188 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kube-rbac-proxy"
Apr 17 17:44:26.463611 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.462200 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3291e5f9-dbb1-4380-81de-6f318659f1c2" containerName="kserve-container"
Apr 17 17:44:26.470214 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.470185 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.473261 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.473228 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a1277-predictor-serving-cert\""
Apr 17 17:44:26.473410 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.473231 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a1277-kube-rbac-proxy-sar-config\""
Apr 17 17:44:26.475536 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.475511 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"]
Apr 17 17:44:26.559841 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.559789 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-error-404-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.560007 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.559847 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.560007 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.559928 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85f9\" (UniqueName: \"kubernetes.io/projected/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-kube-api-access-c85f9\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.660953 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.660853 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-error-404-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.660953 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.660903 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.660953 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.660946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c85f9\" (UniqueName: \"kubernetes.io/projected/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-kube-api-access-c85f9\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.661239 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:44:26.661013 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-a1277-predictor-serving-cert: secret "error-404-isvc-a1277-predictor-serving-cert" not found
Apr 17 17:44:26.661239 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:44:26.661097 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls podName:b8534ca9-5882-4a61-8cc9-35cddd3eb42b nodeName:}" failed. No retries permitted until 2026-04-17 17:44:27.161073071 +0000 UTC m=+1233.132356403 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls") pod "error-404-isvc-a1277-predictor-8558bb5997-nxrxp" (UID: "b8534ca9-5882-4a61-8cc9-35cddd3eb42b") : secret "error-404-isvc-a1277-predictor-serving-cert" not found
Apr 17 17:44:26.661571 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.661550 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-error-404-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.669949 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.669928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85f9\" (UniqueName: \"kubernetes.io/projected/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-kube-api-access-c85f9\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:26.718340 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.718303 2580 generic.go:358] "Generic (PLEG): container finished" podID="53938468-83ca-43fd-8056-3e474d3956a9" containerID="bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f" exitCode=2
Apr 17 17:44:26.718537 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:26.718375 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" event={"ID":"53938468-83ca-43fd-8056-3e474d3956a9","Type":"ContainerDied","Data":"bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f"}
Apr 17 17:44:27.164135 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.164099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") " pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:27.166447 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.166415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls\") pod \"error-404-isvc-a1277-predictor-8558bb5997-nxrxp\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") "
pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" Apr 17 17:44:27.384391 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.384356 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" Apr 17 17:44:27.516140 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.516106 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"] Apr 17 17:44:27.519282 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:44:27.519252 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8534ca9_5882_4a61_8cc9_35cddd3eb42b.slice/crio-3c30f142cd4e58a1fa4383cf6414f08a5771fb4ab1283c073104b57a096acbac WatchSource:0}: Error finding container 3c30f142cd4e58a1fa4383cf6414f08a5771fb4ab1283c073104b57a096acbac: Status 404 returned error can't find the container with id 3c30f142cd4e58a1fa4383cf6414f08a5771fb4ab1283c073104b57a096acbac Apr 17 17:44:27.723874 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.723784 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" event={"ID":"b8534ca9-5882-4a61-8cc9-35cddd3eb42b","Type":"ContainerStarted","Data":"dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0"} Apr 17 17:44:27.723874 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.723822 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" event={"ID":"b8534ca9-5882-4a61-8cc9-35cddd3eb42b","Type":"ContainerStarted","Data":"2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615"} Apr 17 17:44:27.723874 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.723831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" event={"ID":"b8534ca9-5882-4a61-8cc9-35cddd3eb42b","Type":"ContainerStarted","Data":"3c30f142cd4e58a1fa4383cf6414f08a5771fb4ab1283c073104b57a096acbac"} Apr 17 17:44:27.724082 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.723913 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" Apr 17 17:44:27.746196 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:27.746141 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podStartSLOduration=1.746125205 podStartE2EDuration="1.746125205s" podCreationTimestamp="2026-04-17 17:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:27.743738053 +0000 UTC m=+1233.715021408" watchObservedRunningTime="2026-04-17 17:44:27.746125205 +0000 UTC m=+1233.717408547" Apr 17 17:44:28.727806 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:28.727773 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" Apr 17 17:44:28.729082 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:28.729053 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:44:29.642212 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.642190 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:44:29.684135 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.684057 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls\") pod \"53938468-83ca-43fd-8056-3e474d3956a9\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " Apr 17 17:44:29.684292 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.684161 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53938468-83ca-43fd-8056-3e474d3956a9-error-404-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"53938468-83ca-43fd-8056-3e474d3956a9\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " Apr 17 17:44:29.684292 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.684195 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnfgt\" (UniqueName: \"kubernetes.io/projected/53938468-83ca-43fd-8056-3e474d3956a9-kube-api-access-dnfgt\") pod \"53938468-83ca-43fd-8056-3e474d3956a9\" (UID: \"53938468-83ca-43fd-8056-3e474d3956a9\") " Apr 17 17:44:29.684522 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.684496 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53938468-83ca-43fd-8056-3e474d3956a9-error-404-isvc-fa93d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-fa93d-kube-rbac-proxy-sar-config") pod "53938468-83ca-43fd-8056-3e474d3956a9" (UID: "53938468-83ca-43fd-8056-3e474d3956a9"). InnerVolumeSpecName "error-404-isvc-fa93d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:29.686223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.686199 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "53938468-83ca-43fd-8056-3e474d3956a9" (UID: "53938468-83ca-43fd-8056-3e474d3956a9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:29.686350 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.686330 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53938468-83ca-43fd-8056-3e474d3956a9-kube-api-access-dnfgt" (OuterVolumeSpecName: "kube-api-access-dnfgt") pod "53938468-83ca-43fd-8056-3e474d3956a9" (UID: "53938468-83ca-43fd-8056-3e474d3956a9"). InnerVolumeSpecName "kube-api-access-dnfgt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:44:29.732186 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.732153 2580 generic.go:358] "Generic (PLEG): container finished" podID="53938468-83ca-43fd-8056-3e474d3956a9" containerID="fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0" exitCode=0 Apr 17 17:44:29.732625 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.732224 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" Apr 17 17:44:29.732625 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.732239 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" event={"ID":"53938468-83ca-43fd-8056-3e474d3956a9","Type":"ContainerDied","Data":"fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0"} Apr 17 17:44:29.732625 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.732277 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7" event={"ID":"53938468-83ca-43fd-8056-3e474d3956a9","Type":"ContainerDied","Data":"50bea3ee078afe25fb97080822a8a42c99a5ab8798c51bc17823f89f82cd261c"} Apr 17 17:44:29.732625 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.732299 2580 scope.go:117] "RemoveContainer" containerID="bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f" Apr 17 17:44:29.732854 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.732730 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 17 17:44:29.741119 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.741100 2580 scope.go:117] "RemoveContainer" containerID="fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0" Apr 17 17:44:29.748509 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.748489 2580 scope.go:117] "RemoveContainer" containerID="bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f" Apr 17 17:44:29.748798 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:44:29.748777 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f\": container with ID starting with bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f not found: ID does not exist" containerID="bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f" Apr 17 17:44:29.748866 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.748809 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f"} err="failed to get container status \"bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f\": rpc error: code = NotFound desc = could not find container \"bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f\": container with ID starting with bba5199d84f769c648829ac44898788975c3fe1b23944d201fac6cdaac50949f not found: ID does not exist" Apr 17 17:44:29.748866 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.748829 2580 scope.go:117] "RemoveContainer" containerID="fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0" Apr 17 17:44:29.749064 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:44:29.749046 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0\": container with ID starting with fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0 not found: ID does not exist" containerID="fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0" 
Apr 17 17:44:29.749104 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.749068 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0"} err="failed to get container status \"fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0\": rpc error: code = NotFound desc = could not find container \"fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0\": container with ID starting with fa5f96d5467e47b7775006dfa365e14497a1d9dc796398699b82fdea51fa12d0 not found: ID does not exist"
Apr 17 17:44:29.753083 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.753062 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"]
Apr 17 17:44:29.757105 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.757087 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fa93d-predictor-6ff7867c59-kgth7"]
Apr 17 17:44:29.784906 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.784870 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53938468-83ca-43fd-8056-3e474d3956a9-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:44:29.784906 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.784903 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/53938468-83ca-43fd-8056-3e474d3956a9-error-404-isvc-fa93d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:44:29.785120 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:29.784920 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnfgt\" (UniqueName: \"kubernetes.io/projected/53938468-83ca-43fd-8056-3e474d3956a9-kube-api-access-dnfgt\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:44:30.641648 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:30.641611 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53938468-83ca-43fd-8056-3e474d3956a9" path="/var/lib/kubelet/pods/53938468-83ca-43fd-8056-3e474d3956a9/volumes"
Apr 17 17:44:34.737184 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:34.737154 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:44:34.737745 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:34.737718 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 17 17:44:44.738437 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:44.738357 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 17 17:44:47.034793 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.034750 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"]
Apr 17 17:44:47.035329 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.035039 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container" containerID="cri-o://c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e" gracePeriod=30
Apr 17 17:44:47.035329 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.035059 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kube-rbac-proxy" containerID="cri-o://e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b" gracePeriod=30
Apr 17 17:44:47.121457 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121420 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"]
Apr 17 17:44:47.121826 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121810 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container"
Apr 17 17:44:47.121826 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121827 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container"
Apr 17 17:44:47.121942 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121836 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kube-rbac-proxy"
Apr 17 17:44:47.121942 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121841 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kube-rbac-proxy"
Apr 17 17:44:47.121942 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121902 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kube-rbac-proxy"
Apr 17 17:44:47.121942 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.121912 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="53938468-83ca-43fd-8056-3e474d3956a9" containerName="kserve-container"
Apr 17 17:44:47.126409 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.126376 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.130551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.130526 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-313c7-predictor-serving-cert\""
Apr 17 17:44:47.130551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.130546 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-313c7-kube-rbac-proxy-sar-config\""
Apr 17 17:44:47.144822 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.144798 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"]
Apr 17 17:44:47.236414 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.236374 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4155761-7757-439a-a6ab-446a33381869-proxy-tls\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.236653 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.236451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e4155761-7757-439a-a6ab-446a33381869-error-404-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.236653 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.236496 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f6qf\" (UniqueName: \"kubernetes.io/projected/e4155761-7757-439a-a6ab-446a33381869-kube-api-access-8f6qf\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.337248 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.337220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f6qf\" (UniqueName: \"kubernetes.io/projected/e4155761-7757-439a-a6ab-446a33381869-kube-api-access-8f6qf\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.337438 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.337288 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4155761-7757-439a-a6ab-446a33381869-proxy-tls\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.337438 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.337387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e4155761-7757-439a-a6ab-446a33381869-error-404-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.338071 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.338048 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e4155761-7757-439a-a6ab-446a33381869-error-404-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.339675 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.339643 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4155761-7757-439a-a6ab-446a33381869-proxy-tls\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.346231 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.346204 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f6qf\" (UniqueName: \"kubernetes.io/projected/e4155761-7757-439a-a6ab-446a33381869-kube-api-access-8f6qf\") pod \"error-404-isvc-313c7-predictor-5555fd9c67-rqprx\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.436573 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.436535 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.560002 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.559973 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"]
Apr 17 17:44:47.562590 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:44:47.562547 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4155761_7757_439a_a6ab_446a33381869.slice/crio-be7c764d351aab767a37da3a387715eee08923d3703af19936d7ae4710af94e3 WatchSource:0}: Error finding container be7c764d351aab767a37da3a387715eee08923d3703af19936d7ae4710af94e3: Status 404 returned error can't find the container with id be7c764d351aab767a37da3a387715eee08923d3703af19936d7ae4710af94e3
Apr 17 17:44:47.800970 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.800890 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" event={"ID":"e4155761-7757-439a-a6ab-446a33381869","Type":"ContainerStarted","Data":"a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a"}
Apr 17 17:44:47.800970 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.800935 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" event={"ID":"e4155761-7757-439a-a6ab-446a33381869","Type":"ContainerStarted","Data":"30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa"}
Apr 17 17:44:47.801300 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.801250 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:47.801300 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.801280 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" event={"ID":"e4155761-7757-439a-a6ab-446a33381869","Type":"ContainerStarted","Data":"be7c764d351aab767a37da3a387715eee08923d3703af19936d7ae4710af94e3"}
Apr 17 17:44:47.803511 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.803480 2580 generic.go:358] "Generic (PLEG): container finished" podID="82747371-50d2-4c12-8f87-ecc7784c2781" containerID="e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b" exitCode=2
Apr 17 17:44:47.803661 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.803543 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" event={"ID":"82747371-50d2-4c12-8f87-ecc7784c2781","Type":"ContainerDied","Data":"e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b"}
Apr 17 17:44:47.819634 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:47.819569 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podStartSLOduration=0.819552152 podStartE2EDuration="819.552152ms" podCreationTimestamp="2026-04-17 17:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:47.818407703 +0000 UTC m=+1253.789691071" watchObservedRunningTime="2026-04-17 17:44:47.819552152 +0000 UTC m=+1253.790835471"
Apr 17 17:44:48.808974 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:48.808941 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:48.810187 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:48.810161 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 17 17:44:49.560687 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:49.560645 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused"
Apr 17 17:44:49.812892 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:49.812792 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 17 17:44:50.281858 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.281835 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:44:50.365287 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.365185 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82747371-50d2-4c12-8f87-ecc7784c2781-error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"82747371-50d2-4c12-8f87-ecc7784c2781\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") "
Apr 17 17:44:50.365465 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.365308 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls\") pod \"82747371-50d2-4c12-8f87-ecc7784c2781\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") "
Apr 17 17:44:50.365465 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.365332 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vmrv\" (UniqueName: \"kubernetes.io/projected/82747371-50d2-4c12-8f87-ecc7784c2781-kube-api-access-9vmrv\") pod \"82747371-50d2-4c12-8f87-ecc7784c2781\" (UID: \"82747371-50d2-4c12-8f87-ecc7784c2781\") "
Apr 17 17:44:50.365671 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.365646 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82747371-50d2-4c12-8f87-ecc7784c2781-error-404-isvc-d1a8d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-d1a8d-kube-rbac-proxy-sar-config") pod "82747371-50d2-4c12-8f87-ecc7784c2781" (UID: "82747371-50d2-4c12-8f87-ecc7784c2781"). InnerVolumeSpecName "error-404-isvc-d1a8d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:44:50.367339 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.367318 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82747371-50d2-4c12-8f87-ecc7784c2781-kube-api-access-9vmrv" (OuterVolumeSpecName: "kube-api-access-9vmrv") pod "82747371-50d2-4c12-8f87-ecc7784c2781" (UID: "82747371-50d2-4c12-8f87-ecc7784c2781"). InnerVolumeSpecName "kube-api-access-9vmrv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:44:50.367434 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.367417 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "82747371-50d2-4c12-8f87-ecc7784c2781" (UID: "82747371-50d2-4c12-8f87-ecc7784c2781"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:44:50.470606 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.466823 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82747371-50d2-4c12-8f87-ecc7784c2781-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:44:50.470606 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.466861 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vmrv\" (UniqueName: \"kubernetes.io/projected/82747371-50d2-4c12-8f87-ecc7784c2781-kube-api-access-9vmrv\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:44:50.470606 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.466878 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/82747371-50d2-4c12-8f87-ecc7784c2781-error-404-isvc-d1a8d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 17 17:44:50.816772 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.816737 2580 generic.go:358] "Generic (PLEG): container finished" podID="82747371-50d2-4c12-8f87-ecc7784c2781" containerID="c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e" exitCode=0
Apr 17 17:44:50.817241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.816811 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" event={"ID":"82747371-50d2-4c12-8f87-ecc7784c2781","Type":"ContainerDied","Data":"c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e"}
Apr 17 17:44:50.817241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.816837 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"
Apr 17 17:44:50.817241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.816851 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b" event={"ID":"82747371-50d2-4c12-8f87-ecc7784c2781","Type":"ContainerDied","Data":"8dda1043da0aa2a29976eaed4da569c66fe5974269651dc66f79637e425c5718"}
Apr 17 17:44:50.817241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.816867 2580 scope.go:117] "RemoveContainer" containerID="e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b"
Apr 17 17:44:50.825065 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.825044 2580 scope.go:117] "RemoveContainer" containerID="c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e"
Apr 17 17:44:50.832362 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.832341 2580 scope.go:117] "RemoveContainer" containerID="e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b"
Apr 17 17:44:50.832629 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:44:50.832606 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b\": container with ID starting with e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b not found: ID does not exist" containerID="e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b"
Apr 17 17:44:50.832698 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.832637 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b"} err="failed to get container status \"e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b\": rpc error: code = NotFound desc = could not find container \"e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b\": container with ID starting with e5e1e14f4d1a6c854c11712436e0048c38e16405aaf64dda992e07f94d875f2b not found: ID does not exist"
Apr 17 17:44:50.832698 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.832656 2580 scope.go:117] "RemoveContainer" containerID="c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e"
Apr 17 17:44:50.832917 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:44:50.832899 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e\": container with ID starting with c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e not found: ID does not exist" containerID="c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e"
Apr 17 17:44:50.832964 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.832927 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e"} err="failed to get container status \"c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e\": rpc error: code = NotFound desc = could not find container \"c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e\": container with ID starting with c7dd43566930d7d0d332e87e4f0168b06ffe15097257d4efe94716b0cc73f78e not found: ID does not exist"
Apr 17 17:44:50.838753 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.838730 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"]
Apr 17 17:44:50.842966 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:50.842945 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-d1a8d-predictor-8544d9ff69-9bp5b"]
Apr 17 17:44:52.640763 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:52.640731 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" path="/var/lib/kubelet/pods/82747371-50d2-4c12-8f87-ecc7784c2781/volumes"
Apr 17 17:44:54.737704 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:54.737668 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 17 17:44:54.817351 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:54.817317 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:44:54.818002 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:44:54.817956 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 17 17:45:04.738600 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:04.738540 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 17 17:45:04.818788 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:04.818744 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 17 17:45:14.738662 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:14.738625 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:45:14.818195 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:14.818161 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 17 17:45:24.818076 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:24.818031 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 17 17:45:34.819340 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:34.819304 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"
Apr 17 17:45:36.581951 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.581914 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"]
Apr 17 17:45:36.582440 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.582316 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" containerID="cri-o://2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615" gracePeriod=30
Apr 17 17:45:36.582440 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.582363 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kube-rbac-proxy" containerID="cri-o://dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0" gracePeriod=30
Apr 17 17:45:36.642809 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.642777 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"]
Apr 17 17:45:36.643243 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.643225 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container"
Apr 17 17:45:36.643343 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.643245 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container"
Apr 17 17:45:36.643343 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.643256 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kube-rbac-proxy"
Apr 17 17:45:36.643343 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.643264 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kube-rbac-proxy"
Apr 17 17:45:36.643343 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.643337 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kserve-container"
Apr 17 17:45:36.643559 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.643356 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="82747371-50d2-4c12-8f87-ecc7784c2781" containerName="kube-rbac-proxy"
Apr 17 17:45:36.646660 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.646638 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.649205 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.649176 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-dd512-predictor-serving-cert\""
Apr 17 17:45:36.649326 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.649269 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-dd512-kube-rbac-proxy-sar-config\""
Apr 17 17:45:36.657808 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.657786 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"]
Apr 17 17:45:36.746936 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.746899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4664528d-dfcc-4452-8664-0c3273aca6cc-error-404-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.746936 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.746939 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsjhv\" (UniqueName: \"kubernetes.io/projected/4664528d-dfcc-4452-8664-0c3273aca6cc-kube-api-access-dsjhv\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.747188 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.747092 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4664528d-dfcc-4452-8664-0c3273aca6cc-proxy-tls\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.848398 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.848304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4664528d-dfcc-4452-8664-0c3273aca6cc-proxy-tls\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.848398 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.848353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4664528d-dfcc-4452-8664-0c3273aca6cc-error-404-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.848669 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.848479 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsjhv\" (UniqueName: \"kubernetes.io/projected/4664528d-dfcc-4452-8664-0c3273aca6cc-kube-api-access-dsjhv\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.849075 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.849056 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4664528d-dfcc-4452-8664-0c3273aca6cc-error-404-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.850718 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.850695 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4664528d-dfcc-4452-8664-0c3273aca6cc-proxy-tls\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.857705 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.857678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsjhv\" (UniqueName: \"kubernetes.io/projected/4664528d-dfcc-4452-8664-0c3273aca6cc-kube-api-access-dsjhv\") pod \"error-404-isvc-dd512-predictor-844bd6fb77-gxzv4\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.957826 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.957784 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:36.976471 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.976432 2580 generic.go:358] "Generic (PLEG): container finished" podID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerID="dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0" exitCode=2
Apr 17 17:45:36.976634 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:36.976480 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" event={"ID":"b8534ca9-5882-4a61-8cc9-35cddd3eb42b","Type":"ContainerDied","Data":"dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0"}
Apr 17 17:45:37.079183 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:37.079139 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"]
Apr 17 17:45:37.081845 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:45:37.081815 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4664528d_dfcc_4452_8664_0c3273aca6cc.slice/crio-32876f66101b8d709fab3b44e87cd61f6a7322c17085022dbffdeee75c862822 WatchSource:0}: Error finding container 32876f66101b8d709fab3b44e87cd61f6a7322c17085022dbffdeee75c862822: Status 404 returned error can't find the container with id 32876f66101b8d709fab3b44e87cd61f6a7322c17085022dbffdeee75c862822
Apr 17 17:45:37.981552 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:37.981512 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" event={"ID":"4664528d-dfcc-4452-8664-0c3273aca6cc","Type":"ContainerStarted","Data":"cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f"}
Apr 17 17:45:37.981552 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:37.981548 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" event={"ID":"4664528d-dfcc-4452-8664-0c3273aca6cc","Type":"ContainerStarted","Data":"d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec"}
Apr 17 17:45:37.981552 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:37.981557 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" event={"ID":"4664528d-dfcc-4452-8664-0c3273aca6cc","Type":"ContainerStarted","Data":"32876f66101b8d709fab3b44e87cd61f6a7322c17085022dbffdeee75c862822"}
Apr 17 17:45:37.982043 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:37.981686 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:38.001051 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:38.000992 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podStartSLOduration=2.000977433 podStartE2EDuration="2.000977433s" podCreationTimestamp="2026-04-17 17:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:45:38.000843477 +0000 UTC m=+1303.972126818" watchObservedRunningTime="2026-04-17 17:45:38.000977433 +0000 UTC m=+1303.972260774"
Apr 17 17:45:38.985134 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:38.985101 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"
Apr 17 17:45:38.986394 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:38.986368 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 17 17:45:39.733207 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.733171 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused"
Apr 17 17:45:39.925094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.925071 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:45:39.974501 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.974412 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls\") pod \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") "
Apr 17 17:45:39.974501 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.974490 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-error-404-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") "
Apr 17 17:45:39.974744 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.974536 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c85f9\" (UniqueName: \"kubernetes.io/projected/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-kube-api-access-c85f9\") pod \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\" (UID: \"b8534ca9-5882-4a61-8cc9-35cddd3eb42b\") "
Apr 17 17:45:39.974952 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.974927 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-error-404-isvc-a1277-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a1277-kube-rbac-proxy-sar-config") pod "b8534ca9-5882-4a61-8cc9-35cddd3eb42b" (UID: "b8534ca9-5882-4a61-8cc9-35cddd3eb42b"). InnerVolumeSpecName "error-404-isvc-a1277-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:45:39.976573 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.976552 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b8534ca9-5882-4a61-8cc9-35cddd3eb42b" (UID: "b8534ca9-5882-4a61-8cc9-35cddd3eb42b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:45:39.976692 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.976672 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-kube-api-access-c85f9" (OuterVolumeSpecName: "kube-api-access-c85f9") pod "b8534ca9-5882-4a61-8cc9-35cddd3eb42b" (UID: "b8534ca9-5882-4a61-8cc9-35cddd3eb42b"). InnerVolumeSpecName "kube-api-access-c85f9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:45:39.990486 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.990455 2580 generic.go:358] "Generic (PLEG): container finished" podID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerID="2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615" exitCode=0
Apr 17 17:45:39.990885 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.990531 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"
Apr 17 17:45:39.990885 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.990537 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" event={"ID":"b8534ca9-5882-4a61-8cc9-35cddd3eb42b","Type":"ContainerDied","Data":"2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615"}
Apr 17 17:45:39.990885 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.990573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp" event={"ID":"b8534ca9-5882-4a61-8cc9-35cddd3eb42b","Type":"ContainerDied","Data":"3c30f142cd4e58a1fa4383cf6414f08a5771fb4ab1283c073104b57a096acbac"}
Apr 17 17:45:39.990885 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.990605 2580 scope.go:117] "RemoveContainer" containerID="dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0"
Apr 17 17:45:39.991204 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:39.991060 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 17 17:45:40.001853 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.001833 2580 scope.go:117] "RemoveContainer" containerID="2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615"
Apr 17 17:45:40.009608 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.009559 2580 scope.go:117] "RemoveContainer" containerID="dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0"
Apr 17 17:45:40.009896 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:45:40.009876 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0\": container with ID starting with dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0 not found: ID does not exist" containerID="dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0"
Apr 17 17:45:40.009949 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.009906 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0"} err="failed to get container status \"dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0\": rpc error: code = NotFound desc = could not find container \"dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0\": container with ID starting with dbbb97f24769d06d6543196e1b1a6bf3f658016414cb23577a24dfc9aeb80cc0 not found: ID does not exist"
Apr 17 17:45:40.009949 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.009926 2580 scope.go:117] "RemoveContainer" containerID="2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615"
Apr 17 17:45:40.010183 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:45:40.010165 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615\": container with ID starting with 2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615 not found: ID does not exist" containerID="2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615"
Apr 17 17:45:40.010254 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.010188 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615"} err="failed to get container status \"2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615\": rpc error: code = NotFound desc = could not find container \"2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615\": container with ID starting with 2b43b58083b0e6ab8d375239af3cfbe345994698ebc4d492f3415fc2e0180615 not found: ID does not exist" Apr 17 17:45:40.013924 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.013904 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"] Apr 17 17:45:40.016455 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.016434 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a1277-predictor-8558bb5997-nxrxp"] Apr 17 17:45:40.076077 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.076044 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:45:40.076077 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.076077 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-error-404-isvc-a1277-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:45:40.076329 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.076088 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c85f9\" (UniqueName: \"kubernetes.io/projected/b8534ca9-5882-4a61-8cc9-35cddd3eb42b-kube-api-access-c85f9\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:45:40.641618 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:40.641566 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" path="/var/lib/kubelet/pods/b8534ca9-5882-4a61-8cc9-35cddd3eb42b/volumes" Apr 17 17:45:44.995097 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:44.995070 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" Apr 17 17:45:44.995519 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:44.995468 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:45:54.996251 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:45:54.996160 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:46:04.995551 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:46:04.995510 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:46:14.996001 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:46:14.995911 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 17 17:46:24.996807 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:46:24.996773 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" Apr 17 17:48:54.633808 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:48:54.633780 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:48:54.646112 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:48:54.646080 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:53:54.658772 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:53:54.658661 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:53:54.672980 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:53:54.672958 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:54:01.832223 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.832189 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"] Apr 17 17:54:01.832678 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.832460 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" containerID="cri-o://30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa" gracePeriod=30 Apr 17 17:54:01.832678 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.832531 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kube-rbac-proxy" containerID="cri-o://a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a" gracePeriod=30 Apr 17 17:54:01.907713 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.907667 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg"] Apr 17 17:54:01.908146 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.908127 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kube-rbac-proxy" Apr 17 17:54:01.908241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.908148 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kube-rbac-proxy" Apr 17 17:54:01.908241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.908168 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" Apr 17 17:54:01.908241 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.908175 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" Apr 17 17:54:01.908397 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.908276 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kserve-container" Apr 17 17:54:01.908397 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.908288 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8534ca9-5882-4a61-8cc9-35cddd3eb42b" containerName="kube-rbac-proxy" Apr 17 17:54:01.911479 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.911456 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:01.913990 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.913960 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e4a99-kube-rbac-proxy-sar-config\"" Apr 17 17:54:01.913990 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.913960 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e4a99-predictor-serving-cert\"" Apr 17 17:54:01.920290 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:01.920263 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg"] Apr 17 17:54:02.031430 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.031387 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57cw6\" (UniqueName: \"kubernetes.io/projected/8f5d59d1-3867-4c2a-b48c-5c7206893353-kube-api-access-57cw6\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.031644 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.031467 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f5d59d1-3867-4c2a-b48c-5c7206893353-error-404-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.031644 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.031534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f5d59d1-3867-4c2a-b48c-5c7206893353-proxy-tls\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.132316 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.132224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f5d59d1-3867-4c2a-b48c-5c7206893353-proxy-tls\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " 
pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.132316 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.132280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57cw6\" (UniqueName: \"kubernetes.io/projected/8f5d59d1-3867-4c2a-b48c-5c7206893353-kube-api-access-57cw6\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.132537 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.132347 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f5d59d1-3867-4c2a-b48c-5c7206893353-error-404-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.133093 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.133066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f5d59d1-3867-4c2a-b48c-5c7206893353-error-404-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.134747 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.134721 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f5d59d1-3867-4c2a-b48c-5c7206893353-proxy-tls\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.140701 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.140675 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57cw6\" (UniqueName: \"kubernetes.io/projected/8f5d59d1-3867-4c2a-b48c-5c7206893353-kube-api-access-57cw6\") pod \"error-404-isvc-e4a99-predictor-f849b4d65-x75zg\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.223405 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.223369 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.349197 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.349167 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg"] Apr 17 17:54:02.351700 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:54:02.351662 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5d59d1_3867_4c2a_b48c_5c7206893353.slice/crio-282da75d0281dd921d333137b42bd1d53edce985928510a72152927ac2eb6f3c WatchSource:0}: Error finding container 282da75d0281dd921d333137b42bd1d53edce985928510a72152927ac2eb6f3c: Status 404 returned error can't find the container with id 282da75d0281dd921d333137b42bd1d53edce985928510a72152927ac2eb6f3c Apr 17 17:54:02.353318 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.353302 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:54:02.692957 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.692862 2580 generic.go:358] "Generic (PLEG): container finished" podID="e4155761-7757-439a-a6ab-446a33381869" containerID="a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a" exitCode=2 Apr 17 17:54:02.692957 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.692935 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" event={"ID":"e4155761-7757-439a-a6ab-446a33381869","Type":"ContainerDied","Data":"a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a"} Apr 17 17:54:02.694560 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.694539 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" event={"ID":"8f5d59d1-3867-4c2a-b48c-5c7206893353","Type":"ContainerStarted","Data":"a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600"} Apr 17 17:54:02.694649 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.694565 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" event={"ID":"8f5d59d1-3867-4c2a-b48c-5c7206893353","Type":"ContainerStarted","Data":"d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5"} Apr 17 17:54:02.694649 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.694595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" event={"ID":"8f5d59d1-3867-4c2a-b48c-5c7206893353","Type":"ContainerStarted","Data":"282da75d0281dd921d333137b42bd1d53edce985928510a72152927ac2eb6f3c"} Apr 17 17:54:02.694716 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.694688 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:02.712315 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:02.712256 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podStartSLOduration=1.71223803 podStartE2EDuration="1.71223803s" podCreationTimestamp="2026-04-17 17:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:54:02.711940298 +0000 UTC m=+1808.683223638" watchObservedRunningTime="2026-04-17 
17:54:02.71223803 +0000 UTC m=+1808.683521373" Apr 17 17:54:03.698088 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:03.698044 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:03.699373 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:03.699341 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 17:54:04.700826 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:04.700783 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 17:54:04.813358 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:04.813314 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 17 17:54:04.818771 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:04.818745 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 17 17:54:05.077545 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.077509 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" Apr 17 17:54:05.160401 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.160371 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f6qf\" (UniqueName: \"kubernetes.io/projected/e4155761-7757-439a-a6ab-446a33381869-kube-api-access-8f6qf\") pod \"e4155761-7757-439a-a6ab-446a33381869\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " Apr 17 17:54:05.160609 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.160429 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4155761-7757-439a-a6ab-446a33381869-proxy-tls\") pod \"e4155761-7757-439a-a6ab-446a33381869\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " Apr 17 17:54:05.160609 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.160464 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e4155761-7757-439a-a6ab-446a33381869-error-404-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"e4155761-7757-439a-a6ab-446a33381869\" (UID: \"e4155761-7757-439a-a6ab-446a33381869\") " Apr 17 17:54:05.160942 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.160912 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4155761-7757-439a-a6ab-446a33381869-error-404-isvc-313c7-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-313c7-kube-rbac-proxy-sar-config") pod "e4155761-7757-439a-a6ab-446a33381869" (UID: "e4155761-7757-439a-a6ab-446a33381869"). InnerVolumeSpecName "error-404-isvc-313c7-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:54:05.162475 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.162449 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4155761-7757-439a-a6ab-446a33381869-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e4155761-7757-439a-a6ab-446a33381869" (UID: "e4155761-7757-439a-a6ab-446a33381869"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:54:05.162564 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.162525 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4155761-7757-439a-a6ab-446a33381869-kube-api-access-8f6qf" (OuterVolumeSpecName: "kube-api-access-8f6qf") pod "e4155761-7757-439a-a6ab-446a33381869" (UID: "e4155761-7757-439a-a6ab-446a33381869"). InnerVolumeSpecName "kube-api-access-8f6qf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:54:05.262074 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.261980 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4155761-7757-439a-a6ab-446a33381869-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:54:05.262074 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.262015 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e4155761-7757-439a-a6ab-446a33381869-error-404-isvc-313c7-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:54:05.262074 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.262030 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8f6qf\" (UniqueName: \"kubernetes.io/projected/e4155761-7757-439a-a6ab-446a33381869-kube-api-access-8f6qf\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:54:05.705074 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.705041 2580 generic.go:358] "Generic (PLEG): container finished" podID="e4155761-7757-439a-a6ab-446a33381869" containerID="30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa" exitCode=0 Apr 17 17:54:05.705493 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.705120 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" Apr 17 17:54:05.705493 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.705125 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" event={"ID":"e4155761-7757-439a-a6ab-446a33381869","Type":"ContainerDied","Data":"30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa"} Apr 17 17:54:05.705493 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.705164 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx" event={"ID":"e4155761-7757-439a-a6ab-446a33381869","Type":"ContainerDied","Data":"be7c764d351aab767a37da3a387715eee08923d3703af19936d7ae4710af94e3"} Apr 17 17:54:05.705493 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.705184 2580 scope.go:117] "RemoveContainer" containerID="a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a" Apr 17 17:54:05.716594 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.716553 2580 scope.go:117] "RemoveContainer" containerID="30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa" Apr 17 17:54:05.724415 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.724389 2580 scope.go:117] "RemoveContainer" containerID="a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a" Apr 17 17:54:05.724738 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:54:05.724718 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a\": container with ID starting with a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a not found: ID does not exist" containerID="a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a" Apr 17 17:54:05.724810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.724746 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a"} err="failed to get container status \"a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a\": rpc error: code = NotFound desc = could not find container \"a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a\": container with ID starting with a489e662e24de0891ff607e74c83d5ba97da15e4990a2e8f5ea703ccb935388a not found: ID does not exist" Apr 17 17:54:05.724810 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.724769 2580 scope.go:117] "RemoveContainer" containerID="30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa" Apr 17 17:54:05.725010 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:54:05.724995 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa\": container with ID starting with 30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa not found: ID does not exist" containerID="30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa" Apr 17 17:54:05.725056 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.725011 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa"} err="failed to get container status \"30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa\": rpc error: code = NotFound desc = could not find container \"30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa\": container with ID starting with 30063d77313d4d7d4c2328652a094185aedfb066870c374cf2971d2d8148b6aa not found: ID does not exist" Apr 17 17:54:05.730125 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.730100 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"] Apr 17 17:54:05.732290 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:05.732269 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-313c7-predictor-5555fd9c67-rqprx"] Apr 17 17:54:06.641656 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:06.641621 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4155761-7757-439a-a6ab-446a33381869" path="/var/lib/kubelet/pods/e4155761-7757-439a-a6ab-446a33381869/volumes" Apr 17 17:54:09.705186 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:09.705152 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:09.705705 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:09.705628 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 17:54:19.706065 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:19.706027 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 17:54:29.706049 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:29.706010 2580 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 17:54:39.705698 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:39.705638 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 17 17:54:49.706315 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:49.706285 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:54:51.386289 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.386234 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"] Apr 17 17:54:51.386728 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.386523 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" containerID="cri-o://d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec" gracePeriod=30 Apr 17 17:54:51.386728 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.386572 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kube-rbac-proxy" containerID="cri-o://cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f" gracePeriod=30 Apr 17 17:54:51.479909 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.479873 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42"] Apr 17 17:54:51.480243 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.480230 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kube-rbac-proxy" Apr 17 17:54:51.480292 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.480244 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kube-rbac-proxy" Apr 17 17:54:51.480292 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.480256 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" Apr 17 17:54:51.480292 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.480261 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" Apr 17 17:54:51.480390 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.480324 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kube-rbac-proxy" Apr 17 17:54:51.480390 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.480335 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4155761-7757-439a-a6ab-446a33381869" containerName="kserve-container" Apr 17 17:54:51.483609 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.483574 2580 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.491448 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.491406 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0283f-predictor-serving-cert\"" Apr 17 17:54:51.492216 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.491815 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-0283f-kube-rbac-proxy-sar-config\"" Apr 17 17:54:51.503533 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.503507 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42"] Apr 17 17:54:51.542238 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.542207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c96035-1ce5-497f-8381-40ed551ff7fe-proxy-tls\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.542400 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.542271 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c96035-1ce5-497f-8381-40ed551ff7fe-error-404-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.542400 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.542310 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zqr\" (UniqueName: \"kubernetes.io/projected/b5c96035-1ce5-497f-8381-40ed551ff7fe-kube-api-access-t9zqr\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.643475 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.643376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c96035-1ce5-497f-8381-40ed551ff7fe-error-404-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.643475 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.643422 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zqr\" (UniqueName: \"kubernetes.io/projected/b5c96035-1ce5-497f-8381-40ed551ff7fe-kube-api-access-t9zqr\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.643475 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.643471 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c96035-1ce5-497f-8381-40ed551ff7fe-proxy-tls\") pod 
\"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.644087 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.644060 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c96035-1ce5-497f-8381-40ed551ff7fe-error-404-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.646101 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.646074 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c96035-1ce5-497f-8381-40ed551ff7fe-proxy-tls\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.651806 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.651777 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zqr\" (UniqueName: \"kubernetes.io/projected/b5c96035-1ce5-497f-8381-40ed551ff7fe-kube-api-access-t9zqr\") pod \"error-404-isvc-0283f-predictor-5c787cfb59-k6k42\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.795869 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.795838 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:51.860289 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.860181 2580 generic.go:358] "Generic (PLEG): container finished" podID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerID="cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f" exitCode=2 Apr 17 17:54:51.860289 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.860209 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" event={"ID":"4664528d-dfcc-4452-8664-0c3273aca6cc","Type":"ContainerDied","Data":"cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f"} Apr 17 17:54:51.925621 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:51.925521 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42"] Apr 17 17:54:51.929115 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:54:51.929083 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c96035_1ce5_497f_8381_40ed551ff7fe.slice/crio-61686bb0789d8575019a59cc31533fbe55e02e6190ec10c385383527004bafc5 WatchSource:0}: Error finding container 61686bb0789d8575019a59cc31533fbe55e02e6190ec10c385383527004bafc5: Status 404 returned error can't find the container with id 61686bb0789d8575019a59cc31533fbe55e02e6190ec10c385383527004bafc5 Apr 17 17:54:52.865078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:52.865044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" 
event={"ID":"b5c96035-1ce5-497f-8381-40ed551ff7fe","Type":"ContainerStarted","Data":"89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d"} Apr 17 17:54:52.865078 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:52.865082 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" event={"ID":"b5c96035-1ce5-497f-8381-40ed551ff7fe","Type":"ContainerStarted","Data":"c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc"} Apr 17 17:54:52.865524 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:52.865094 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" event={"ID":"b5c96035-1ce5-497f-8381-40ed551ff7fe","Type":"ContainerStarted","Data":"61686bb0789d8575019a59cc31533fbe55e02e6190ec10c385383527004bafc5"} Apr 17 17:54:52.865524 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:52.865185 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:52.883499 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:52.883437 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podStartSLOduration=1.883416745 podStartE2EDuration="1.883416745s" podCreationTimestamp="2026-04-17 17:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:54:52.882918603 +0000 UTC m=+1858.854201944" watchObservedRunningTime="2026-04-17 17:54:52.883416745 +0000 UTC m=+1858.854700086" Apr 17 17:54:53.868703 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:53.868671 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:53.869990 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:53.869959 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 17:54:54.733204 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.733181 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" Apr 17 17:54:54.771644 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.771543 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4664528d-dfcc-4452-8664-0c3273aca6cc-proxy-tls\") pod \"4664528d-dfcc-4452-8664-0c3273aca6cc\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " Apr 17 17:54:54.773625 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.773593 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4664528d-dfcc-4452-8664-0c3273aca6cc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4664528d-dfcc-4452-8664-0c3273aca6cc" (UID: "4664528d-dfcc-4452-8664-0c3273aca6cc"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:54:54.872047 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.872018 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4664528d-dfcc-4452-8664-0c3273aca6cc-error-404-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"4664528d-dfcc-4452-8664-0c3273aca6cc\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " Apr 17 17:54:54.872471 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.872092 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsjhv\" (UniqueName: \"kubernetes.io/projected/4664528d-dfcc-4452-8664-0c3273aca6cc-kube-api-access-dsjhv\") pod \"4664528d-dfcc-4452-8664-0c3273aca6cc\" (UID: \"4664528d-dfcc-4452-8664-0c3273aca6cc\") " Apr 17 17:54:54.872471 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.872338 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4664528d-dfcc-4452-8664-0c3273aca6cc-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:54:54.872471 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.872435 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4664528d-dfcc-4452-8664-0c3273aca6cc-error-404-isvc-dd512-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-dd512-kube-rbac-proxy-sar-config") pod "4664528d-dfcc-4452-8664-0c3273aca6cc" (UID: "4664528d-dfcc-4452-8664-0c3273aca6cc"). InnerVolumeSpecName "error-404-isvc-dd512-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:54:54.873041 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.873015 2580 generic.go:358] "Generic (PLEG): container finished" podID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerID="d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec" exitCode=0 Apr 17 17:54:54.873110 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.873098 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" Apr 17 17:54:54.873169 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.873102 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" event={"ID":"4664528d-dfcc-4452-8664-0c3273aca6cc","Type":"ContainerDied","Data":"d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec"} Apr 17 17:54:54.873169 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.873147 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4" event={"ID":"4664528d-dfcc-4452-8664-0c3273aca6cc","Type":"ContainerDied","Data":"32876f66101b8d709fab3b44e87cd61f6a7322c17085022dbffdeee75c862822"} Apr 17 17:54:54.873277 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.873170 2580 scope.go:117] "RemoveContainer" containerID="cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f" Apr 17 17:54:54.873512 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.873479 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 17:54:54.874614 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.874572 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4664528d-dfcc-4452-8664-0c3273aca6cc-kube-api-access-dsjhv" (OuterVolumeSpecName: "kube-api-access-dsjhv") pod "4664528d-dfcc-4452-8664-0c3273aca6cc" (UID: "4664528d-dfcc-4452-8664-0c3273aca6cc"). InnerVolumeSpecName "kube-api-access-dsjhv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:54:54.884537 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.884517 2580 scope.go:117] "RemoveContainer" containerID="d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec" Apr 17 17:54:54.892148 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.892122 2580 scope.go:117] "RemoveContainer" containerID="cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f" Apr 17 17:54:54.892411 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:54:54.892391 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f\": container with ID starting with cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f not found: ID does not exist" containerID="cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f" Apr 17 17:54:54.892462 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.892421 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f"} err="failed to get container status \"cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f\": rpc error: code = NotFound desc = could not find container \"cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f\": container with ID starting with cbc26bc222dab85b0a247dd82db036ddf64e75ec83fa1e04ca171da748f1f75f not found: ID does not exist" Apr 17 17:54:54.892462 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.892440 2580 scope.go:117] "RemoveContainer" containerID="d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec" Apr 17 17:54:54.892678 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:54:54.892662 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec\": container with ID starting with d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec not found: ID does not exist" containerID="d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec" Apr 17 17:54:54.892720 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.892683 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec"} err="failed to get container status \"d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec\": rpc error: code = NotFound desc = could not find container \"d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec\": container with ID starting with d812b31b886e41f38cf81668d83e7d523c8a2451d167005819816debb7911cec not found: ID does not exist" Apr 17 17:54:54.973639 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.973602 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4664528d-dfcc-4452-8664-0c3273aca6cc-error-404-isvc-dd512-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:54:54.973639 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:54.973631 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dsjhv\" (UniqueName: \"kubernetes.io/projected/4664528d-dfcc-4452-8664-0c3273aca6cc-kube-api-access-dsjhv\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 
17:54:55.195079 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:55.195045 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"] Apr 17 17:54:55.198549 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:55.198522 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dd512-predictor-844bd6fb77-gxzv4"] Apr 17 17:54:56.640804 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:56.640768 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" path="/var/lib/kubelet/pods/4664528d-dfcc-4452-8664-0c3273aca6cc/volumes" Apr 17 17:54:59.877541 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:59.877513 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:54:59.878101 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:54:59.878070 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 17:55:09.878629 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:09.878551 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 17:55:12.165968 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.165933 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg"] Apr 17 17:55:12.166440 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.166304 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" containerID="cri-o://d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5" gracePeriod=30 Apr 17 17:55:12.166509 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.166449 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kube-rbac-proxy" containerID="cri-o://a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600" gracePeriod=30 Apr 17 17:55:12.202616 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.202558 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt"] Apr 17 17:55:12.202961 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.202940 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kube-rbac-proxy" Apr 17 17:55:12.202961 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.202957 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kube-rbac-proxy" Apr 17 17:55:12.203147 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.203002 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" 
containerName="kserve-container" Apr 17 17:55:12.203147 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.203010 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" Apr 17 17:55:12.203147 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.203097 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kube-rbac-proxy" Apr 17 17:55:12.203147 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.203112 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4664528d-dfcc-4452-8664-0c3273aca6cc" containerName="kserve-container" Apr 17 17:55:12.207764 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.207742 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.210394 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.210364 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a935a-predictor-serving-cert\"" Apr 17 17:55:12.210394 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.210383 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-a935a-kube-rbac-proxy-sar-config\"" Apr 17 17:55:12.216472 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.216297 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt"] Apr 17 17:55:12.331556 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.331513 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a249c6aa-09d4-40df-b18b-674b50f2e08e-proxy-tls\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.331756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.331566 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a249c6aa-09d4-40df-b18b-674b50f2e08e-error-404-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.331756 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.331656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6jjr\" (UniqueName: \"kubernetes.io/projected/a249c6aa-09d4-40df-b18b-674b50f2e08e-kube-api-access-x6jjr\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.432194 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.432096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6jjr\" (UniqueName: \"kubernetes.io/projected/a249c6aa-09d4-40df-b18b-674b50f2e08e-kube-api-access-x6jjr\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 
17 17:55:12.432365 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.432211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a249c6aa-09d4-40df-b18b-674b50f2e08e-proxy-tls\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.432365 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.432231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a249c6aa-09d4-40df-b18b-674b50f2e08e-error-404-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.432877 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.432857 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a249c6aa-09d4-40df-b18b-674b50f2e08e-error-404-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.434758 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.434736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a249c6aa-09d4-40df-b18b-674b50f2e08e-proxy-tls\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.440365 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.440343 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6jjr\" (UniqueName: \"kubernetes.io/projected/a249c6aa-09d4-40df-b18b-674b50f2e08e-kube-api-access-x6jjr\") pod \"error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.520115 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.520068 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.642986 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.642961 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt"] Apr 17 17:55:12.645045 ip-10-0-130-19 kubenswrapper[2580]: W0417 17:55:12.645017 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda249c6aa_09d4_40df_b18b_674b50f2e08e.slice/crio-7c807e4e61c1a735fb992feb55d57febe09ab984f691de68e71879cd74d711a0 WatchSource:0}: Error finding container 7c807e4e61c1a735fb992feb55d57febe09ab984f691de68e71879cd74d711a0: Status 404 returned error can't find the container with id 7c807e4e61c1a735fb992feb55d57febe09ab984f691de68e71879cd74d711a0 Apr 17 17:55:12.940461 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.940368 2580 generic.go:358] "Generic (PLEG): container finished" podID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerID="a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600" exitCode=2 Apr 17 17:55:12.940461 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.940445 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" event={"ID":"8f5d59d1-3867-4c2a-b48c-5c7206893353","Type":"ContainerDied","Data":"a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600"} Apr 17 17:55:12.941925 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.941898 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" event={"ID":"a249c6aa-09d4-40df-b18b-674b50f2e08e","Type":"ContainerStarted","Data":"c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d"} Apr 17 17:55:12.942057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.941931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" event={"ID":"a249c6aa-09d4-40df-b18b-674b50f2e08e","Type":"ContainerStarted","Data":"65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1"} Apr 17 17:55:12.942057 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.941943 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" event={"ID":"a249c6aa-09d4-40df-b18b-674b50f2e08e","Type":"ContainerStarted","Data":"7c807e4e61c1a735fb992feb55d57febe09ab984f691de68e71879cd74d711a0"} Apr 17 17:55:12.942196 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.942141 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.942196 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.942170 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:12.943404 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.943380 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 17:55:12.959511 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:12.959464 2580 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podStartSLOduration=0.959449367 podStartE2EDuration="959.449367ms" podCreationTimestamp="2026-04-17 17:55:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:55:12.958297695 +0000 UTC m=+1878.929581059" watchObservedRunningTime="2026-04-17 17:55:12.959449367 +0000 UTC m=+1878.930732708" Apr 17 17:55:13.946196 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:13.946152 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 17:55:14.701089 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:14.701033 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 17 17:55:15.419094 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.419071 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:55:15.557568 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.557478 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f5d59d1-3867-4c2a-b48c-5c7206893353-error-404-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"8f5d59d1-3867-4c2a-b48c-5c7206893353\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " Apr 17 17:55:15.557568 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.557567 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57cw6\" (UniqueName: \"kubernetes.io/projected/8f5d59d1-3867-4c2a-b48c-5c7206893353-kube-api-access-57cw6\") pod \"8f5d59d1-3867-4c2a-b48c-5c7206893353\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " Apr 17 17:55:15.557792 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.557628 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f5d59d1-3867-4c2a-b48c-5c7206893353-proxy-tls\") pod \"8f5d59d1-3867-4c2a-b48c-5c7206893353\" (UID: \"8f5d59d1-3867-4c2a-b48c-5c7206893353\") " Apr 17 17:55:15.557929 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.557888 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f5d59d1-3867-4c2a-b48c-5c7206893353-error-404-isvc-e4a99-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e4a99-kube-rbac-proxy-sar-config") pod "8f5d59d1-3867-4c2a-b48c-5c7206893353" (UID: "8f5d59d1-3867-4c2a-b48c-5c7206893353"). InnerVolumeSpecName "error-404-isvc-e4a99-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:55:15.559590 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.559554 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5d59d1-3867-4c2a-b48c-5c7206893353-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f5d59d1-3867-4c2a-b48c-5c7206893353" (UID: "8f5d59d1-3867-4c2a-b48c-5c7206893353"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:55:15.559670 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.559650 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5d59d1-3867-4c2a-b48c-5c7206893353-kube-api-access-57cw6" (OuterVolumeSpecName: "kube-api-access-57cw6") pod "8f5d59d1-3867-4c2a-b48c-5c7206893353" (UID: "8f5d59d1-3867-4c2a-b48c-5c7206893353"). InnerVolumeSpecName "kube-api-access-57cw6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:55:15.658422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.658391 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57cw6\" (UniqueName: \"kubernetes.io/projected/8f5d59d1-3867-4c2a-b48c-5c7206893353-kube-api-access-57cw6\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:55:15.658422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.658419 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f5d59d1-3867-4c2a-b48c-5c7206893353-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:55:15.658422 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.658430 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f5d59d1-3867-4c2a-b48c-5c7206893353-error-404-isvc-e4a99-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 17:55:15.953749 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.953706 2580 generic.go:358] "Generic (PLEG): container finished" podID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerID="d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5" exitCode=0 Apr 17 17:55:15.953904 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.953743 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" event={"ID":"8f5d59d1-3867-4c2a-b48c-5c7206893353","Type":"ContainerDied","Data":"d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5"} Apr 17 17:55:15.953904 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.953786 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" Apr 17 17:55:15.953904 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.953800 2580 scope.go:117] "RemoveContainer" containerID="a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600" Apr 17 17:55:15.953904 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.953789 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg" event={"ID":"8f5d59d1-3867-4c2a-b48c-5c7206893353","Type":"ContainerDied","Data":"282da75d0281dd921d333137b42bd1d53edce985928510a72152927ac2eb6f3c"} Apr 17 17:55:15.962515 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.962495 2580 scope.go:117] "RemoveContainer" containerID="d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5" Apr 17 17:55:15.969726 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.969704 2580 scope.go:117] "RemoveContainer" containerID="a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600" Apr 17 17:55:15.969955 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:55:15.969937 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600\": container with ID starting with a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600 not found: ID does not exist" containerID="a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600" Apr 17 17:55:15.970001 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.969963 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600"} err="failed to get container status \"a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600\": rpc error: code = NotFound desc = could not find container \"a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600\": container with ID starting with a2b4603c9d1dd72defb9b96646ee86a2f04d0fd6df60f2a867a14cdf1e280600 not found: ID does not exist" Apr 17 17:55:15.970001 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.969984 2580 scope.go:117] "RemoveContainer" containerID="d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5" Apr 17 17:55:15.970171 ip-10-0-130-19 kubenswrapper[2580]: E0417 17:55:15.970157 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5\": container with ID starting with d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5 not found: ID does not exist" containerID="d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5" Apr 17 17:55:15.970211 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.970175 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5"} err="failed to get container status \"d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5\": rpc error: code = NotFound desc = could not find container \"d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5\": container with ID starting with d4783d05e4939a4202b9f716ade292c067b40566a8633528a85dbbc9edf342c5 not found: ID does not exist" Apr 17 17:55:15.975333 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.975310 2580 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg"] Apr 17 17:55:15.979010 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:15.978990 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e4a99-predictor-f849b4d65-x75zg"] Apr 17 17:55:16.642209 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:16.642171 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" path="/var/lib/kubelet/pods/8f5d59d1-3867-4c2a-b48c-5c7206893353/volumes" Apr 17 17:55:18.950936 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:18.950903 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:55:18.951420 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:18.951387 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 17:55:19.878063 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:19.878017 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 17:55:28.951741 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:28.951701 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 17:55:29.878528 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:29.878484 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 17 17:55:38.952004 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:38.951952 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 17:55:39.879286 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:39.879254 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 17:55:48.951429 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:48.951388 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 17:55:58.952409 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:55:58.952378 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 17:58:54.682872 ip-10-0-130-19 
kubenswrapper[2580]: I0417 17:58:54.682765 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 17:58:54.706666 ip-10-0-130-19 kubenswrapper[2580]: I0417 17:58:54.706641 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 18:03:54.706722 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:03:54.706562 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 18:03:54.730896 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:03:54.730869 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 18:04:26.842554 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:26.842473 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt"] Apr 17 18:04:26.843139 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:26.842816 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" containerID="cri-o://65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1" gracePeriod=30 Apr 17 18:04:26.843139 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:26.842843 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kube-rbac-proxy" containerID="cri-o://c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d" gracePeriod=30 Apr 17 18:04:27.828364 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:27.828324 2580 generic.go:358] "Generic (PLEG): container finished" podID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerID="c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d" exitCode=2 Apr 17 18:04:27.828549 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:27.828405 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" event={"ID":"a249c6aa-09d4-40df-b18b-674b50f2e08e","Type":"ContainerDied","Data":"c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d"} Apr 17 18:04:28.946412 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:28.946368 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 17 18:04:28.951755 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:28.951725 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 17 18:04:29.988151 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:29.988115 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 18:04:30.150608 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.150545 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a249c6aa-09d4-40df-b18b-674b50f2e08e-proxy-tls\") pod \"a249c6aa-09d4-40df-b18b-674b50f2e08e\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " Apr 17 18:04:30.150608 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.150613 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6jjr\" (UniqueName: \"kubernetes.io/projected/a249c6aa-09d4-40df-b18b-674b50f2e08e-kube-api-access-x6jjr\") pod \"a249c6aa-09d4-40df-b18b-674b50f2e08e\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " Apr 17 18:04:30.150873 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.150641 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a249c6aa-09d4-40df-b18b-674b50f2e08e-error-404-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"a249c6aa-09d4-40df-b18b-674b50f2e08e\" (UID: \"a249c6aa-09d4-40df-b18b-674b50f2e08e\") " Apr 17 18:04:30.151051 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.151020 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a249c6aa-09d4-40df-b18b-674b50f2e08e-error-404-isvc-a935a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-a935a-kube-rbac-proxy-sar-config") pod "a249c6aa-09d4-40df-b18b-674b50f2e08e" (UID: "a249c6aa-09d4-40df-b18b-674b50f2e08e"). InnerVolumeSpecName "error-404-isvc-a935a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:04:30.152773 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.152743 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a249c6aa-09d4-40df-b18b-674b50f2e08e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a249c6aa-09d4-40df-b18b-674b50f2e08e" (UID: "a249c6aa-09d4-40df-b18b-674b50f2e08e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:04:30.152890 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.152771 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a249c6aa-09d4-40df-b18b-674b50f2e08e-kube-api-access-x6jjr" (OuterVolumeSpecName: "kube-api-access-x6jjr") pod "a249c6aa-09d4-40df-b18b-674b50f2e08e" (UID: "a249c6aa-09d4-40df-b18b-674b50f2e08e"). InnerVolumeSpecName "kube-api-access-x6jjr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:04:30.251438 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.251395 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a249c6aa-09d4-40df-b18b-674b50f2e08e-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 18:04:30.251438 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.251429 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6jjr\" (UniqueName: \"kubernetes.io/projected/a249c6aa-09d4-40df-b18b-674b50f2e08e-kube-api-access-x6jjr\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 18:04:30.251438 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.251441 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a249c6aa-09d4-40df-b18b-674b50f2e08e-error-404-isvc-a935a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 18:04:30.841032 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.840990 2580 generic.go:358] "Generic (PLEG): container finished" podID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerID="65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1" exitCode=0 Apr 17 18:04:30.841251 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.841069 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" Apr 17 18:04:30.841251 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.841067 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" event={"ID":"a249c6aa-09d4-40df-b18b-674b50f2e08e","Type":"ContainerDied","Data":"65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1"} Apr 17 18:04:30.841251 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.841172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt" event={"ID":"a249c6aa-09d4-40df-b18b-674b50f2e08e","Type":"ContainerDied","Data":"7c807e4e61c1a735fb992feb55d57febe09ab984f691de68e71879cd74d711a0"} Apr 17 18:04:30.841251 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.841193 2580 scope.go:117] "RemoveContainer" containerID="c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d" Apr 17 18:04:30.850345 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.850326 2580 scope.go:117] "RemoveContainer" containerID="65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1" Apr 17 18:04:30.858371 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.858338 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt"] Apr 17 18:04:30.859494 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.859316 2580 scope.go:117] "RemoveContainer" containerID="c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d" Apr 17 18:04:30.859629 ip-10-0-130-19 kubenswrapper[2580]: E0417 18:04:30.859610 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d\": container with ID starting with c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d not found: ID does not exist" 
containerID="c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d" Apr 17 18:04:30.859680 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.859638 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d"} err="failed to get container status \"c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d\": rpc error: code = NotFound desc = could not find container \"c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d\": container with ID starting with c7518f3e7ee1636468e37daf9cef87e8cdaee8dac273b664ba9e603ea7860a7d not found: ID does not exist" Apr 17 18:04:30.859680 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.859657 2580 scope.go:117] "RemoveContainer" containerID="65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1" Apr 17 18:04:30.859908 ip-10-0-130-19 kubenswrapper[2580]: E0417 18:04:30.859888 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1\": container with ID starting with 65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1 not found: ID does not exist" containerID="65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1" Apr 17 18:04:30.859957 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.859914 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1"} err="failed to get container status \"65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1\": rpc error: code = NotFound desc = could not find container \"65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1\": container with ID starting with 65f3d3d843d2db7cd0447b8a7459627e9c51206f0768eceb22797d91af90e4f1 not found: ID does not exist" Apr 17 18:04:30.864262 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:30.864239 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a935a-predictor-79fc75c4ff-vbnnt"] Apr 17 18:04:32.641165 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:04:32.641111 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" path="/var/lib/kubelet/pods/a249c6aa-09d4-40df-b18b-674b50f2e08e/volumes" Apr 17 18:08:54.728761 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:08:54.728656 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 18:08:54.754616 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:08:54.754568 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 18:12:10.974554 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:10.974523 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42"] Apr 17 18:12:10.975193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:10.974835 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" 
containerID="cri-o://c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc" gracePeriod=30 Apr 17 18:12:10.975193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:10.974870 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kube-rbac-proxy" containerID="cri-o://89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d" gracePeriod=30 Apr 17 18:12:11.377513 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:11.377479 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerID="89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d" exitCode=2 Apr 17 18:12:11.377723 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:11.377553 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" event={"ID":"b5c96035-1ce5-497f-8381-40ed551ff7fe","Type":"ContainerDied","Data":"89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d"} Apr 17 18:12:14.113808 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.113782 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 18:12:14.129399 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.129372 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9zqr\" (UniqueName: \"kubernetes.io/projected/b5c96035-1ce5-497f-8381-40ed551ff7fe-kube-api-access-t9zqr\") pod \"b5c96035-1ce5-497f-8381-40ed551ff7fe\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " Apr 17 18:12:14.129519 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.129423 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c96035-1ce5-497f-8381-40ed551ff7fe-proxy-tls\") pod \"b5c96035-1ce5-497f-8381-40ed551ff7fe\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " Apr 17 18:12:14.129519 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.129459 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c96035-1ce5-497f-8381-40ed551ff7fe-error-404-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"b5c96035-1ce5-497f-8381-40ed551ff7fe\" (UID: \"b5c96035-1ce5-497f-8381-40ed551ff7fe\") " Apr 17 18:12:14.129888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.129862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c96035-1ce5-497f-8381-40ed551ff7fe-error-404-isvc-0283f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-0283f-kube-rbac-proxy-sar-config") pod "b5c96035-1ce5-497f-8381-40ed551ff7fe" (UID: "b5c96035-1ce5-497f-8381-40ed551ff7fe"). InnerVolumeSpecName "error-404-isvc-0283f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:12:14.131805 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.131774 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c96035-1ce5-497f-8381-40ed551ff7fe-kube-api-access-t9zqr" (OuterVolumeSpecName: "kube-api-access-t9zqr") pod "b5c96035-1ce5-497f-8381-40ed551ff7fe" (UID: "b5c96035-1ce5-497f-8381-40ed551ff7fe"). InnerVolumeSpecName "kube-api-access-t9zqr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:12:14.132115 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.132084 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c96035-1ce5-497f-8381-40ed551ff7fe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5c96035-1ce5-497f-8381-40ed551ff7fe" (UID: "b5c96035-1ce5-497f-8381-40ed551ff7fe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:12:14.231090 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.230994 2580 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c96035-1ce5-497f-8381-40ed551ff7fe-proxy-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 18:12:14.231090 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.231037 2580 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c96035-1ce5-497f-8381-40ed551ff7fe-error-404-isvc-0283f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 18:12:14.231090 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.231048 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9zqr\" (UniqueName: \"kubernetes.io/projected/b5c96035-1ce5-497f-8381-40ed551ff7fe-kube-api-access-t9zqr\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 17 18:12:14.387953 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.387912 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerID="c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc" exitCode=0 Apr 17 18:12:14.388149 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.387973 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" event={"ID":"b5c96035-1ce5-497f-8381-40ed551ff7fe","Type":"ContainerDied","Data":"c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc"} Apr 17 18:12:14.388149 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.387990 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" Apr 17 18:12:14.388149 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.388006 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42" event={"ID":"b5c96035-1ce5-497f-8381-40ed551ff7fe","Type":"ContainerDied","Data":"61686bb0789d8575019a59cc31533fbe55e02e6190ec10c385383527004bafc5"} Apr 17 18:12:14.388149 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.388022 2580 scope.go:117] "RemoveContainer" containerID="89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d" Apr 17 18:12:14.396445 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.396424 2580 scope.go:117] "RemoveContainer" containerID="c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc" Apr 17 18:12:14.403487 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.403468 2580 scope.go:117] "RemoveContainer" containerID="89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d" Apr 17 18:12:14.403771 ip-10-0-130-19 kubenswrapper[2580]: E0417 18:12:14.403749 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d\": container with ID starting with 89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d not found: ID does not exist" containerID="89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d" Apr 17 18:12:14.403857 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.403775 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d"} err="failed to get container status \"89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d\": rpc error: code = NotFound desc = could not find container \"89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d\": container with ID starting with 89dcb90139bc8bdf1f0176c1ea1f91d86ef6f68ae55157e66fc5598116cbea4d not found: ID does not exist" Apr 17 18:12:14.403857 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.403792 2580 scope.go:117] "RemoveContainer" containerID="c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc" Apr 17 18:12:14.404002 ip-10-0-130-19 kubenswrapper[2580]: E0417 18:12:14.403982 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc\": container with ID starting with c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc not found: ID does not exist" containerID="c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc" Apr 17 18:12:14.404044 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.404007 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc"} err="failed to get container status \"c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc\": rpc error: code = NotFound desc = could not find container \"c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc\": container with ID starting with c931bf406ec6057937ba8a8cb91d4a14b61bb2587ad4e74dbb36313fcf299cdc not found: ID does not exist" Apr 17 18:12:14.408708 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.408686 2580 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42"] Apr 17 18:12:14.411663 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.411642 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0283f-predictor-5c787cfb59-k6k42"] Apr 17 18:12:14.641560 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:14.641527 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" path="/var/lib/kubelet/pods/b5c96035-1ce5-497f-8381-40ed551ff7fe/volumes" Apr 17 18:12:37.254492 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254452 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kkzt9/must-gather-qchzb"] Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254809 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254823 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254838 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kube-rbac-proxy" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254845 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kube-rbac-proxy" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254854 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254860 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254866 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kube-rbac-proxy" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254871 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kube-rbac-proxy" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254881 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" Apr 17 18:12:37.254888 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254886 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254900 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kube-rbac-proxy" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254908 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kube-rbac-proxy" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254959 2580 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kserve-container" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254968 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kube-rbac-proxy" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254977 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a249c6aa-09d4-40df-b18b-674b50f2e08e" containerName="kserve-container" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254988 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5c96035-1ce5-497f-8381-40ed551ff7fe" containerName="kube-rbac-proxy" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.254996 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kube-rbac-proxy" Apr 17 18:12:37.255193 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.255003 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f5d59d1-3867-4c2a-b48c-5c7206893353" containerName="kserve-container" Apr 17 18:12:37.257938 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.257921 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.260339 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.260310 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kkzt9\"/\"openshift-service-ca.crt\"" Apr 17 18:12:37.261360 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.261340 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kkzt9\"/\"default-dockercfg-jtmx4\"" Apr 17 18:12:37.261490 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.261378 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kkzt9\"/\"kube-root-ca.crt\"" Apr 17 18:12:37.266236 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.266212 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/must-gather-qchzb"] Apr 17 18:12:37.312646 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.312612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8v8\" (UniqueName: \"kubernetes.io/projected/ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c-kube-api-access-nh8v8\") pod \"must-gather-qchzb\" (UID: \"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c\") " pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.312813 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.312675 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c-must-gather-output\") pod \"must-gather-qchzb\" (UID: \"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c\") " pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.413971 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.413929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8v8\" (UniqueName: \"kubernetes.io/projected/ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c-kube-api-access-nh8v8\") pod \"must-gather-qchzb\" (UID: \"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c\") " 
pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.414170 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.413998 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c-must-gather-output\") pod \"must-gather-qchzb\" (UID: \"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c\") " pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.414339 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.414319 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c-must-gather-output\") pod \"must-gather-qchzb\" (UID: \"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c\") " pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.421827 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.421803 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8v8\" (UniqueName: \"kubernetes.io/projected/ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c-kube-api-access-nh8v8\") pod \"must-gather-qchzb\" (UID: \"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c\") " pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.568303 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.568193 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/must-gather-qchzb" Apr 17 18:12:37.688461 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.688436 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/must-gather-qchzb"] Apr 17 18:12:37.690893 ip-10-0-130-19 kubenswrapper[2580]: W0417 18:12:37.690867 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podceb1e6c3_5f14_41e8_9b9c_33dadf465a7c.slice/crio-b8bd32ac5ea20876af84cfd14055ab1b84f8669f63c32db5cb6d1304e1adb583 WatchSource:0}: Error finding container b8bd32ac5ea20876af84cfd14055ab1b84f8669f63c32db5cb6d1304e1adb583: Status 404 returned error can't find the container with id b8bd32ac5ea20876af84cfd14055ab1b84f8669f63c32db5cb6d1304e1adb583 Apr 17 18:12:37.692739 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:37.692722 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:12:38.474152 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:38.474114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/must-gather-qchzb" event={"ID":"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c","Type":"ContainerStarted","Data":"b8bd32ac5ea20876af84cfd14055ab1b84f8669f63c32db5cb6d1304e1adb583"} Apr 17 18:12:39.480737 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:39.480695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/must-gather-qchzb" event={"ID":"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c","Type":"ContainerStarted","Data":"41bf19fbf75483150aa4171f86865af25272f645a59727e668cb8b2a5f9fa8e7"} Apr 17 18:12:39.481264 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:39.481240 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/must-gather-qchzb" event={"ID":"ceb1e6c3-5f14-41e8-9b9c-33dadf465a7c","Type":"ContainerStarted","Data":"a9beef76afdb90f9b9cc99db249fd0430d699a1a3688e0d436280fb692486fc0"} Apr 17 18:12:39.499225 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:39.499170 2580 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-must-gather-kkzt9/must-gather-qchzb" podStartSLOduration=1.654756192 podStartE2EDuration="2.499155639s" podCreationTimestamp="2026-04-17 18:12:37 +0000 UTC" firstStartedPulling="2026-04-17 18:12:37.692876116 +0000 UTC m=+2923.664159434" lastFinishedPulling="2026-04-17 18:12:38.537275548 +0000 UTC m=+2924.508558881" observedRunningTime="2026-04-17 18:12:39.498325237 +0000 UTC m=+2925.469608592" watchObservedRunningTime="2026-04-17 18:12:39.499155639 +0000 UTC m=+2925.470438983" Apr 17 18:12:39.946276 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:39.946245 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8cqlg_9458a330-4a73-457d-a605-d7998538c01b/global-pull-secret-syncer/0.log" Apr 17 18:12:40.102795 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:40.102758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nz4ff_3f262b32-c02c-41bc-be72-1f8ea9896bfd/konnectivity-agent/0.log" Apr 17 18:12:40.124224 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:40.124188 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-19.ec2.internal_a5e99fc6db543cf6951686e44ee274cc/haproxy/0.log" Apr 17 18:12:43.480728 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.480698 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/alertmanager/0.log" Apr 17 18:12:43.505986 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.505948 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/config-reloader/0.log" Apr 17 18:12:43.530593 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.530551 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/kube-rbac-proxy-web/0.log" Apr 17 18:12:43.554403 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.554377 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/kube-rbac-proxy/0.log" Apr 17 18:12:43.591514 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.591489 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/kube-rbac-proxy-metric/0.log" Apr 17 18:12:43.620144 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.620107 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/prom-label-proxy/0.log" Apr 17 18:12:43.643744 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.643706 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a04f1967-3898-40ca-9ed7-804412fa3235/init-config-reloader/0.log" Apr 17 18:12:43.708816 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.708787 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5dqjj_4cb39368-8e2f-4db5-bae2-1b7b4455394f/kube-state-metrics/0.log" Apr 17 18:12:43.730262 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.730235 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5dqjj_4cb39368-8e2f-4db5-bae2-1b7b4455394f/kube-rbac-proxy-main/0.log" 
Apr 17 18:12:43.752651 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.752551 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5dqjj_4cb39368-8e2f-4db5-bae2-1b7b4455394f/kube-rbac-proxy-self/0.log" Apr 17 18:12:43.809802 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.809770 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-s954w_a67cfe6e-d4aa-4c24-9313-a4be369b3f41/monitoring-plugin/0.log" Apr 17 18:12:43.837921 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.837885 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-njmkn_1fa41c7d-0098-4482-ad82-48d0da635522/node-exporter/0.log" Apr 17 18:12:43.859561 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.859528 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-njmkn_1fa41c7d-0098-4482-ad82-48d0da635522/kube-rbac-proxy/0.log" Apr 17 18:12:43.882160 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:43.882128 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-njmkn_1fa41c7d-0098-4482-ad82-48d0da635522/init-textfile/0.log" Apr 17 18:12:44.039388 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.039305 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-mjgmx_efba4076-5827-49c0-8be0-d5c74c47988c/kube-rbac-proxy-main/0.log" Apr 17 18:12:44.058195 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.058168 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-mjgmx_efba4076-5827-49c0-8be0-d5c74c47988c/kube-rbac-proxy-self/0.log" Apr 17 18:12:44.077174 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.077142 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-mjgmx_efba4076-5827-49c0-8be0-d5c74c47988c/openshift-state-metrics/0.log" Apr 17 18:12:44.304439 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.304353 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qst7c_f527a836-fa7a-4cf4-8ad3-586bee36a45b/prometheus-operator-admission-webhook/0.log" Apr 17 18:12:44.334725 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.334684 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-c5699c7c9-9txtc_88afd659-9be2-49eb-b958-426fa64e4320/telemeter-client/0.log" Apr 17 18:12:44.357863 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.357831 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-c5699c7c9-9txtc_88afd659-9be2-49eb-b958-426fa64e4320/reload/0.log" Apr 17 18:12:44.376892 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:44.376861 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-c5699c7c9-9txtc_88afd659-9be2-49eb-b958-426fa64e4320/kube-rbac-proxy/0.log" Apr 17 18:12:46.463463 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:46.463426 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9786d7974-l64jm_7c5e1428-d554-470d-bace-00baf72c619f/console/0.log" Apr 17 18:12:47.169376 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.169341 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj"] Apr 17 18:12:47.174098 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.174066 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.187653 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.187625 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj"] Apr 17 18:12:47.308940 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.308903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-lib-modules\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.308940 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.308939 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-proc\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.309174 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.308957 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2glnp\" (UniqueName: \"kubernetes.io/projected/f9abed3a-43c0-457f-b9f9-019445fee7ff-kube-api-access-2glnp\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.309174 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.309036 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-podres\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.309174 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.309076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-sys\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410430 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410399 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-lib-modules\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410430 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410433 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-proc\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 
18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2glnp\" (UniqueName: \"kubernetes.io/projected/f9abed3a-43c0-457f-b9f9-019445fee7ff-kube-api-access-2glnp\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-podres\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410506 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-sys\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-proc\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410605 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-lib-modules\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410612 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-sys\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.410701 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.410662 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f9abed3a-43c0-457f-b9f9-019445fee7ff-podres\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.419286 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.419258 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2glnp\" (UniqueName: \"kubernetes.io/projected/f9abed3a-43c0-457f-b9f9-019445fee7ff-kube-api-access-2glnp\") pod \"perf-node-gather-daemonset-phxmj\" (UID: \"f9abed3a-43c0-457f-b9f9-019445fee7ff\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.484837 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.484752 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:47.591304 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.591274 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h5vmx_c063b8d8-8182-438f-a272-69a64fcbb153/dns/0.log" Apr 17 18:12:47.624053 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.624019 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h5vmx_c063b8d8-8182-438f-a272-69a64fcbb153/kube-rbac-proxy/0.log" Apr 17 18:12:47.636899 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.636873 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj"] Apr 17 18:12:47.639712 ip-10-0-130-19 kubenswrapper[2580]: W0417 18:12:47.639684 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf9abed3a_43c0_457f_b9f9_019445fee7ff.slice/crio-3ffc98c69ad30643fc109394b271b929c92e4fcabfef2f1c59dc711851125b36 WatchSource:0}: Error finding container 3ffc98c69ad30643fc109394b271b929c92e4fcabfef2f1c59dc711851125b36: Status 404 returned error can't find the container with id 3ffc98c69ad30643fc109394b271b929c92e4fcabfef2f1c59dc711851125b36 Apr 17 18:12:47.788986 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:47.788906 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pch4m_23dab589-f077-4e94-93bc-392122228de4/dns-node-resolver/0.log" Apr 17 18:12:48.256347 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:48.256310 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8db7c_9d2d7141-d6ee-4b25-bef5-a3cc8d5c413c/node-ca/0.log" Apr 17 18:12:48.522179 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:48.522088 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" event={"ID":"f9abed3a-43c0-457f-b9f9-019445fee7ff","Type":"ContainerStarted","Data":"d3b69aa92a880fadde920f5baa803d5a58f000f4e44597578f83a9c624c5753e"} Apr 17 18:12:48.522179 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:48.522131 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" event={"ID":"f9abed3a-43c0-457f-b9f9-019445fee7ff","Type":"ContainerStarted","Data":"3ffc98c69ad30643fc109394b271b929c92e4fcabfef2f1c59dc711851125b36"} Apr 17 18:12:48.522179 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:48.522161 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:48.541686 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:48.541636 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" podStartSLOduration=1.541612254 podStartE2EDuration="1.541612254s" podCreationTimestamp="2026-04-17 18:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:48.53965226 +0000 UTC m=+2934.510935600" watchObservedRunningTime="2026-04-17 18:12:48.541612254 +0000 UTC m=+2934.512895595" Apr 17 18:12:49.297197 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:49.297154 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-l6nv9_1586f132-dd9c-4636-a7c7-87b1b730dc01/serve-healthcheck-canary/0.log" Apr 17 18:12:49.746031 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:49.745994 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d966q_70f0e0b4-2719-4e2b-95f2-d115223c13dd/kube-rbac-proxy/0.log" Apr 17 18:12:49.766405 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:49.766375 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d966q_70f0e0b4-2719-4e2b-95f2-d115223c13dd/exporter/0.log" Apr 17 18:12:49.786268 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:49.786237 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d966q_70f0e0b4-2719-4e2b-95f2-d115223c13dd/extractor/0.log" Apr 17 18:12:51.751950 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:51.751893 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-fkxfv_a50faf80-a870-4209-9a73-8dc84fd00c4b/manager/0.log" Apr 17 18:12:52.021473 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:52.021382 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-dc8r8_435b8490-b210-434f-b30e-2543e0137e4f/s3-init/0.log" Apr 17 18:12:52.050418 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:52.050371 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-vm8r6_516ed833-a365-4252-945b-a1f54e70350b/seaweedfs/0.log" Apr 17 18:12:54.537000 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:54.536970 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-phxmj" Apr 17 18:12:55.616095 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:55.616059 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5xd9l_e4d9e99e-ca09-49e3-a1d6-e5beebfc6147/migrator/0.log" Apr 17 18:12:55.640188 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:55.640145 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5xd9l_e4d9e99e-ca09-49e3-a1d6-e5beebfc6147/graceful-termination/0.log" Apr 17 18:12:56.980494 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:56.980453 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-72h2h_8f4942e8-dd1f-4333-b5fc-5aaeb1efedb2/kube-multus/0.log" Apr 17 18:12:57.003686 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.003657 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/kube-multus-additional-cni-plugins/0.log" Apr 17 18:12:57.026312 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.026289 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/egress-router-binary-copy/0.log" Apr 17 18:12:57.047712 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.047687 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/cni-plugins/0.log" Apr 17 18:12:57.071282 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.071248 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/bond-cni-plugin/0.log" Apr 17 18:12:57.097281 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.097253 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/routeoverride-cni/0.log" Apr 17 18:12:57.120953 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.120923 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/whereabouts-cni-bincopy/0.log" Apr 17 18:12:57.142659 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.142632 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5nrtn_afb0cf40-4c7d-4082-a5f2-64ef60067cde/whereabouts-cni/0.log" Apr 17 18:12:57.493272 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.493246 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-knvfd_1227f475-d747-4720-ad95-d72a46d6d1fb/network-metrics-daemon/0.log" Apr 17 18:12:57.511200 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:57.511170 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-knvfd_1227f475-d747-4720-ad95-d72a46d6d1fb/kube-rbac-proxy/0.log" Apr 17 18:12:59.067398 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.067319 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-controller/0.log" Apr 17 18:12:59.081706 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.081674 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/0.log" Apr 17 18:12:59.108072 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.108039 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovn-acl-logging/1.log" Apr 17 18:12:59.130527 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.130496 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/kube-rbac-proxy-node/0.log" Apr 17 18:12:59.152782 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.152753 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 18:12:59.168983 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.168950 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/northd/0.log" Apr 17 18:12:59.187112 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.187073 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/nbdb/0.log" Apr 17 18:12:59.205447 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.205419 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/sbdb/0.log" Apr 17 18:12:59.390273 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:12:59.390239 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rjptt_6afc6d79-46b9-4af3-84d9-3ed59a13c61a/ovnkube-controller/0.log" Apr 17 18:13:00.308638 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:13:00.308538 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4qnzz_850cf630-0fb1-482f-9e3d-a1525bdf6a39/network-check-target-container/0.log" Apr 17 18:13:01.237441 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:13:01.237411 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-sj6nb_45be972b-ce44-43f8-9b8b-860260b4c7ab/iptables-alerter/0.log" Apr 17 18:13:01.826087 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:13:01.826055 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8s5r4_31b6121d-8e98-43cd-84cf-8f938f63e6bd/tuned/0.log" Apr 17 18:13:03.435937 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:13:03.435907 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-d8pmq_3f6de31f-775d-42c4-9aa8-91d5f855192b/cluster-samples-operator/0.log" Apr 17 18:13:03.450526 ip-10-0-130-19 kubenswrapper[2580]: I0417 18:13:03.450489 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-d8pmq_3f6de31f-775d-42c4-9aa8-91d5f855192b/cluster-samples-operator-watch/0.log"