Apr 21 10:03:51.537388 ip-10-0-129-84 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:51.972062 ip-10-0-129-84 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:51.972062 ip-10-0-129-84 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:51.972062 ip-10-0-129-84 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:51.972062 ip-10-0-129-84 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:51.972062 ip-10-0-129-84 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
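The deprecation warnings above all point at the drop-in config file named by the kubelet's --config flag (on this host, /etc/kubernetes/kubelet.conf per the FLAG dump below). A minimal KubeletConfiguration sketch of moving the flagged parameters into that file — the field names are from the kubelet.config.k8s.io/v1beta1 API, but the values shown are illustrative, not taken from this host:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (sizes are illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# per the warning, use eviction thresholds instead of
# --minimum-container-ttl-duration (threshold is illustrative)
evictionHard:
  memory.available: 100Mi
```

With these set in the config file, the corresponding command-line flags can be dropped from the kubelet unit, which silences the deprecation warnings.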
Apr 21 10:03:51.973606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.973519 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 10:03:51.979640 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979624 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:51.979640 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979639 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979643 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979647 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979650 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979654 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979662 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979666 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979669 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979672 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979675 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979677 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979680 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979683 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979685 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979688 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979690 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979693 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979696 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979701 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:51.979703 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979705 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979708 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979711 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979714 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979717 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979720 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979722 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979725 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979728 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979730 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979733 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979735 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979738 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979744 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979747 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979749 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979752 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979755 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979757 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:51.980173 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979759 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979763 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979767 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979772 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979774 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979777 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979780 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979783 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979786 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979789 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979791 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979794 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979796 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979799 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979802 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979805 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979808 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979811 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979813 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:51.980620 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979816 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979818 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979821 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979824 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979827 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979829 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979832 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979834 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979837 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979840 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979843 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979845 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979848 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979850 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979852 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979855 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979858 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979860 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979863 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979865 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:51.981090 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979869 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979871 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979874 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979876 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979879 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979882 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979884 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.979886 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980314 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980319 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980322 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980325 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980328 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980330 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980333 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980336 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980338 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980341 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980343 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980346 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:51.981591 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980352 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980355 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980358 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980360 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980363 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980366 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980368 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980371 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980374 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980376 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980379 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980381 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980384 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980387 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980389 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980391 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980394 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980397 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980400 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:51.982075 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980402 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980405 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980408 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980411 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980413 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980416 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980418 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980421 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980424 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980426 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980429 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980431 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980434 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980436 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980439 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980441 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980444 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980446 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980450 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:51.982560 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980452 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980456 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980460 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980462 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980465 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980468 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980471 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980474 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980477 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980480 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980482 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980485 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980488 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980490 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980493 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980496 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980498 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980501 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980503 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:51.983035 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980506 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980508 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980511 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980513 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980516 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980518 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980521 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980524 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
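The repeated feature_gate.go:328 warnings are easier to audit once de-duplicated. A small sketch that counts each unrecognized gate; in practice the input would come from `journalctl -u kubelet` (or this host's unit name), but a sample from the entries above is inlined here so the snippet stands alone:

```shell
# Summarize "unrecognized feature gate" warnings: one line per gate, with a count.
summarize_gates() {
  grep -o 'unrecognized feature gate: [A-Za-z0-9]*' \
    | sed 's/.*: //' \
    | sort | uniq -c | sort -rn
}

summarize_gates <<'EOF'
W0421 10:03:51.979624 2567 feature_gate.go:328] unrecognized feature gate: Example2
W0421 10:03:51.980513 2567 feature_gate.go:328] unrecognized feature gate: Example2
W0421 10:03:51.979639 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
EOF
```

Each gate in this log appears exactly twice because the kubelet parses the gate list once per configuration source, so a count other than two would itself be worth a closer look.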
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980527 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980531 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980533 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980536 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980538 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980542 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980544 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980546 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.980549 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982133 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982142 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982148 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982153 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 10:03:51.983532 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982157 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982161 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982165 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982170 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982174 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982177 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982181 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982184 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982187 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982190 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982193 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982196 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982199 2567 flags.go:64] FLAG: --cloud-config=""
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982202 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982205 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982209 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982212 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982215 2567 flags.go:64] FLAG: --config-dir=""
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982218 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982221 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982225 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982228 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982232 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982235 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 10:03:51.984050 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982238 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982241 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982244 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982247 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982250 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982254 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982258 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982261 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982264 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982267 2567 flags.go:64] FLAG: --enable-server="true"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982270 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982274 2567 flags.go:64] FLAG: --event-burst="100"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982277 2567 flags.go:64] FLAG: --event-qps="50"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982280 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982284 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982287 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982290 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982293 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982297 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982300 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982303 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982306 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982309 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982311 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982314 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 10:03:51.984658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982318 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982320 2567 flags.go:64] FLAG: --feature-gates=""
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982325 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982328 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982331 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982335 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982338 2567 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982341 2567 flags.go:64] FLAG: --help="false"
Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982344 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-129-84.ec2.internal"
21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982347 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982350 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982353 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982356 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982360 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982363 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982367 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982370 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982373 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982376 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982379 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982382 2567 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982385 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982388 2567 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:51.985311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982390 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982393 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982396 2567 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982399 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982402 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982405 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982410 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982413 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982416 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982419 2567 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982422 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982425 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982428 2567 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982431 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: 
I0421 10:03:51.982435 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982439 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982443 2567 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982446 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982449 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982452 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982455 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982458 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982461 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982464 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982471 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:51.985877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982474 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982477 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982480 2567 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982483 2567 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982488 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982491 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982494 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982497 2567 flags.go:64] FLAG: --port="10250" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982500 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982503 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0076b33f22b2a3016" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982507 2567 flags.go:64] FLAG: --qos-reserved="" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982509 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982512 2567 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982515 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982518 2567 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982522 2567 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982525 2567 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982527 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:51.986546 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982531 2567 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982535 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982538 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982541 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982544 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982547 2567 flags.go:64] FLAG: --runonce="false" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982550 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:51.986546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982553 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982556 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982559 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982561 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982564 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982571 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982574 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982577 2567 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982580 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982583 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982586 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982589 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982593 2567 flags.go:64] FLAG: --system-cgroups="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982596 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982601 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982604 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982607 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982611 2567 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982614 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982617 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982620 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982623 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:51.987158 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:03:51.982626 2567 flags.go:64] FLAG: --v="2" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982631 2567 flags.go:64] FLAG: --version="false" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982635 2567 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:51.987158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982640 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.982644 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982740 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982744 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982747 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982750 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982754 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982757 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982760 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982763 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982766 2567 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982769 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982773 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982776 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982778 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982781 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982783 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982786 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982788 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 10:03:51.987766 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982792 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982794 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982797 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982799 2567 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAWS Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982802 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982805 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982807 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982810 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982812 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982815 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982818 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982820 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982824 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982827 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982829 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982832 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982835 
2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982837 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982840 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982842 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:51.988279 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982845 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982849 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982853 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982856 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982858 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982862 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982865 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982869 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982872 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982875 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982878 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982881 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982883 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982886 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982889 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982892 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982894 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982897 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982900 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:51.988932 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982902 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982905 2567 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImages Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982907 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982910 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982913 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982919 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982922 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982925 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982927 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982930 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982932 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982936 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982938 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982941 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982944 2567 feature_gate.go:328] 
unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982947 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982949 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982951 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982955 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982958 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:51.989751 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982960 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982963 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982966 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982969 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982971 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982974 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982977 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 
10:03:51.982979 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982981 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.982984 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.990575 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.983817 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:51.991688 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.991668 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 10:03:51.991688 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.991688 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991738 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991744 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991747 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991751 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991753 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991756 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991758 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991762 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:51.991760 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991765 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991768 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991770 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991773 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991775 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991778 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991781 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991783 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991786 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991789 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991791 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991794 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991796 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991799 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991802 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991805 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991807 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991810 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991812 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:51.991980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991815 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991817 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991820 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991822 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991825 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991828 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991830 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991833 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991835 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991838 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991840 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991843 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991846 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991848 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991851 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991854 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991857 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991862 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991866 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991870 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:51.992457 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991873 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991876 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991879 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991882 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991884 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991887 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991890 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991893 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991896 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991898 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991901 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991904 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991906 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991909 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991912 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991914 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991917 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991919 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991922 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991925 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:51.992950 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991927 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991929 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991932 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991934 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991939 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991942 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991945 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991948 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991951 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991955 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991958 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991960 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991963 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991966 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991968 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991970 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991973 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991976 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:51.993491 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.991978 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.991983 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992079 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992084 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992087 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992090 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992094 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992096 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992100 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992103 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992105 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992124 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992127 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992130 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992133 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992136 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:51.993940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992138 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992141 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992143 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992146 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992148 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992151 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992155 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992159 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992164 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992167 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992170 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992173 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992175 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992178 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992181 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992183 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992186 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992188 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992191 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:51.994345 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992193 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992196 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992199 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992202 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992204 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992206 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992209 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992212 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992214 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992217 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992219 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992221 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992224 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992227 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992229 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992232 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992234 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992237 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992240 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992243 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:51.994815 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992246 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992248 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992252 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992255 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992257 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992260 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992262 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992265 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992268 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992270 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992273 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992276 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992278 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992281 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992283 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992286 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992288 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992291 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992293 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992296 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:51.995323 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992298 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992301 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992303 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992306 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992308 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992311 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992313 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992316 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992320 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992323 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992326 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992329 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:51.992332 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.992337 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:51.995826 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.993150 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 10:03:51.998706 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:51.998692 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 10:03:52.000103 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.000091 2567 server.go:1019] "Starting client certificate rotation"
Apr 21 10:03:52.000300 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.000204 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:52.000300 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.000249 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:52.024560 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.024541 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:52.030810 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.030794 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:52.042991 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.042973 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 21 10:03:52.048826 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.048809 2567 log.go:25] "Validated CRI v1 image API"
Apr 21 10:03:52.050545 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.050529 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 10:03:52.053324 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.053309 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:52.054372 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.054353 2567 fs.go:135] Filesystem UUIDs: map[1b2b7474-9e77-429b-af30-41d40e508a33:/dev/nvme0n1p4 214743fd-b951-4208-9668-dbfaf511a87c:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 21 10:03:52.054477 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.054371 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 10:03:52.060987 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.060869 2567 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:52.058568427 +0000 UTC m=+0.403018303 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100892 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fdfb9287eb0047d767da960415329 SystemUUID:ec2fdfb9-287e-b004-7d76-7da960415329 BootID:458884db-8086-4be2-af2f-3a272a3bdd7b Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ad:59:90:7e:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ad:59:90:7e:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:ad:c7:8c:c8:fa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 10:03:52.060987 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.060974 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 10:03:52.061137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.061082 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 10:03:52.063780 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.063758 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 10:03:52.063911 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.063783 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-84.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 10:03:52.063954 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.063922 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 10:03:52.063954 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.063930 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 10:03:52.063954 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.063943 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 10:03:52.064033 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.063955 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 10:03:52.065255 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.065244 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 10:03:52.065357 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.065349 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 10:03:52.067720 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.067710 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 10:03:52.067754 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.067723 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 10:03:52.067754 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.067735 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 10:03:52.067754 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.067743 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 21 10:03:52.067754 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.067751 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 10:03:52.068838 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.068823 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 10:03:52.068838 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.068841 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 10:03:52.071980 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.071962 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 10:03:52.073190 ip-10-0-129-84
kubenswrapper[2567]: I0421 10:03:52.073177 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 10:03:52.075130 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075102 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075134 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075141 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075146 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075152 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075158 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075164 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075170 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075178 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075184 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 10:03:52.075191 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075193 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 10:03:52.075456 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.075202 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 10:03:52.076262 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.076241 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kmk4g" Apr 21 10:03:52.077148 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.077137 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 10:03:52.077207 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.077153 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 10:03:52.079824 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.079805 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-84.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 10:03:52.079899 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.079856 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 10:03:52.080666 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.080651 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 10:03:52.080749 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.080693 2567 server.go:1295] "Started kubelet" Apr 21 10:03:52.080799 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.080772 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 10:03:52.080902 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.080852 2567 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 10:03:52.080940 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.080929 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 10:03:52.081504 ip-10-0-129-84 systemd[1]: Started Kubernetes Kubelet. Apr 21 10:03:52.082054 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.081885 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 10:03:52.082714 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.082697 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 21 10:03:52.084357 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.084331 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kmk4g" Apr 21 10:03:52.090356 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.090174 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 10:03:52.090356 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.090182 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:52.091123 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.091093 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 10:03:52.091209 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.091164 2567 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 10:03:52.091329 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.091295 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found" Apr 21 10:03:52.092123 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092069 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 10:03:52.092239 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092229 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 10:03:52.092442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092419 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 21 10:03:52.092442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092427 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 21 10:03:52.092837 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092802 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 10:03:52.092966 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092948 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:52.093094 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.092953 2567 factory.go:55] Registering systemd factory Apr 21 10:03:52.093175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.093101 2567 factory.go:223] Registration of the systemd container factory successfully Apr 21 10:03:52.093376 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.093355 2567 factory.go:153] Registering CRI-O factory Apr 21 10:03:52.093376 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.093372 2567 factory.go:223] Registration of the crio container factory 
successfully Apr 21 10:03:52.093498 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.093395 2567 factory.go:103] Registering Raw factory Apr 21 10:03:52.093498 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.093408 2567 manager.go:1196] Started watching for new ooms in manager Apr 21 10:03:52.094168 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.094156 2567 manager.go:319] Starting recovery of all containers Apr 21 10:03:52.095876 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.095850 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-84.ec2.internal" not found Apr 21 10:03:52.095976 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.095885 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-84.ec2.internal\" not found" node="ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.106459 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.106443 2567 manager.go:324] Recovery completed Apr 21 10:03:52.110435 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.110422 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:52.112235 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.112219 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-84.ec2.internal" not found Apr 21 10:03:52.112593 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.112580 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:52.112639 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.112611 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:52.112639 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.112623 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientPID" Apr 21 
10:03:52.113049 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.113035 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 10:03:52.113099 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.113049 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 10:03:52.113099 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.113069 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 10:03:52.115008 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.114998 2567 policy_none.go:49] "None policy: Start" Apr 21 10:03:52.115045 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.115015 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 10:03:52.115045 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.115024 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 21 10:03:52.162667 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.162653 2567 manager.go:341] "Starting Device Plugin manager" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.162694 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.162706 2567 server.go:85] "Starting device plugin registration server" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.162927 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.162937 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.163032 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.163167 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin 
watcher) starts" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.163176 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.163641 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 10:03:52.165193 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.163672 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-84.ec2.internal\" not found" Apr 21 10:03:52.172008 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.171991 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-84.ec2.internal" not found Apr 21 10:03:52.250792 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.250724 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 10:03:52.251937 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.251919 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 10:03:52.251999 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.251948 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 10:03:52.251999 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.251969 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 10:03:52.251999 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.251976 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 10:03:52.252158 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.252008 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 10:03:52.255545 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.255523 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:52.263370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.263356 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:52.264490 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.264476 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:52.264571 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.264506 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:52.264571 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.264523 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:52.264571 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.264551 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.272916 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.272901 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.272981 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.272921 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-84.ec2.internal\": node \"ip-10-0-129-84.ec2.internal\" not found" Apr 21 10:03:52.307373 
ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.307353 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found" Apr 21 10:03:52.352433 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.352391 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal"] Apr 21 10:03:52.352521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.352488 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:52.353345 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.353331 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:52.353429 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.353363 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:52.353429 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.353384 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:52.354706 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.354692 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:52.354885 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.354871 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.354934 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.354898 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:52.355377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.355355 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:52.355478 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.355391 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:52.355478 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.355405 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:52.355478 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.355454 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:52.355478 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.355478 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:52.355623 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.355488 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:52.356756 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.356742 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.356796 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.356764 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 10:03:52.357523 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.357508 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientMemory" Apr 21 10:03:52.357571 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.357533 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 10:03:52.357571 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.357543 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeHasSufficientPID" Apr 21 10:03:52.381662 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.381634 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-84.ec2.internal\" not found" node="ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.385827 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.385812 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-84.ec2.internal\" not found" node="ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.394128 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.394096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e84065e070e871c05028ff6c3f13e37a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal\" (UID: \"e84065e070e871c05028ff6c3f13e37a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.394216 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:03:52.394138 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e84065e070e871c05028ff6c3f13e37a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal\" (UID: \"e84065e070e871c05028ff6c3f13e37a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.394216 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.394199 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/366ead18e7fb69eb4a529be7bbd9f14e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-84.ec2.internal\" (UID: \"366ead18e7fb69eb4a529be7bbd9f14e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.407755 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.407737 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found" Apr 21 10:03:52.494568 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.494538 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e84065e070e871c05028ff6c3f13e37a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal\" (UID: \"e84065e070e871c05028ff6c3f13e37a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.494710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.494588 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e84065e070e871c05028ff6c3f13e37a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal\" (UID: \"e84065e070e871c05028ff6c3f13e37a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 
10:03:52.494710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.494545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e84065e070e871c05028ff6c3f13e37a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal\" (UID: \"e84065e070e871c05028ff6c3f13e37a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.494710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.494617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/366ead18e7fb69eb4a529be7bbd9f14e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-84.ec2.internal\" (UID: \"366ead18e7fb69eb4a529be7bbd9f14e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.494710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.494686 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e84065e070e871c05028ff6c3f13e37a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal\" (UID: \"e84065e070e871c05028ff6c3f13e37a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.494854 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.494691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/366ead18e7fb69eb4a529be7bbd9f14e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-84.ec2.internal\" (UID: \"366ead18e7fb69eb4a529be7bbd9f14e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" Apr 21 10:03:52.508855 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.508810 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found" Apr 21 10:03:52.609411 ip-10-0-129-84 
kubenswrapper[2567]: E0421 10:03:52.609387 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:52.683007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.682984 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal"
Apr 21 10:03:52.688590 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.688573 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal"
Apr 21 10:03:52.710356 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.710334 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:52.810885 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.810812 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:52.911333 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:52.911307 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:52.999934 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:52.999904 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 10:03:53.000525 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.000038 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:53.000525 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.000086 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 10:03:53.012054 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:53.012031 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:53.087696 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.087623 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:52 +0000 UTC" deadline="2027-10-29 07:14:40.681501268 +0000 UTC"
Apr 21 10:03:53.087696 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.087654 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13341h10m47.593852147s"
Apr 21 10:03:53.090816 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.090801 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:53.112310 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:53.112287 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:53.113254 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.113238 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:53.139189 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.139167 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v5fp2"
Apr 21 10:03:53.146580 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.146564 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v5fp2"
Apr 21 10:03:53.213353 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:53.213325 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:53.240980 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:53.240954 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84065e070e871c05028ff6c3f13e37a.slice/crio-a19075ed6830f586d1bcb715f9ceed3b55bf01f910d88745b04240a7642e8a01 WatchSource:0}: Error finding container a19075ed6830f586d1bcb715f9ceed3b55bf01f910d88745b04240a7642e8a01: Status 404 returned error can't find the container with id a19075ed6830f586d1bcb715f9ceed3b55bf01f910d88745b04240a7642e8a01
Apr 21 10:03:53.241217 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:53.241199 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod366ead18e7fb69eb4a529be7bbd9f14e.slice/crio-1751c2c455f993f7ab61c8e3a69af161bdd21e36fad83a17ff6101ec61f23653 WatchSource:0}: Error finding container 1751c2c455f993f7ab61c8e3a69af161bdd21e36fad83a17ff6101ec61f23653: Status 404 returned error can't find the container with id 1751c2c455f993f7ab61c8e3a69af161bdd21e36fad83a17ff6101ec61f23653
Apr 21 10:03:53.245150 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.245132 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:03:53.254406 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.254369 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" event={"ID":"e84065e070e871c05028ff6c3f13e37a","Type":"ContainerStarted","Data":"a19075ed6830f586d1bcb715f9ceed3b55bf01f910d88745b04240a7642e8a01"}
Apr 21 10:03:53.255302 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.255277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" event={"ID":"366ead18e7fb69eb4a529be7bbd9f14e","Type":"ContainerStarted","Data":"1751c2c455f993f7ab61c8e3a69af161bdd21e36fad83a17ff6101ec61f23653"}
Apr 21 10:03:53.313776 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:53.313752 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-84.ec2.internal\" not found"
Apr 21 10:03:53.389312 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.389255 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:53.391084 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.391069 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal"
Apr 21 10:03:53.403255 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.403227 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 10:03:53.404770 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.404757 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal"
Apr 21 10:03:53.414841 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.414826 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 10:03:53.531977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:53.531950 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:54.017459 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.017427 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:54.068430 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.068401 2567 apiserver.go:52] "Watching apiserver"
Apr 21 10:03:54.075751 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.075724 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 10:03:54.076771 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.076740 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-vr484","openshift-cluster-node-tuning-operator/tuned-z82s8","openshift-dns/node-resolver-d9np2","openshift-multus/multus-additional-cni-plugins-65267","openshift-multus/multus-kchl2","openshift-ovn-kubernetes/ovnkube-node-l2qr9","kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7","openshift-image-registry/node-ca-j2lqn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal","openshift-multus/network-metrics-daemon-vrs72","openshift-network-diagnostics/network-check-target-lf67k","openshift-network-operator/iptables-alerter-p7n2z"]
Apr 21 10:03:54.079797 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.079691 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k"
Apr 21 10:03:54.079797 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.079796 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e"
Apr 21 10:03:54.081022 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.080992 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.081146 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.081089 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d9np2"
Apr 21 10:03:54.082873 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.082262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.083534 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.083513 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.083648 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.083526 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.083648 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.083643 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-l622j\""
Apr 21 10:03:54.083773 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.083644 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-855nf\""
Apr 21 10:03:54.083773 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.083664 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.083945 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.083914 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.085291 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.085041 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.085291 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.085269 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.085618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.085600 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.085700 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.085687 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:54.086048 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.085884 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 10:03:54.086048 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.085959 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ttg7n\""
Apr 21 10:03:54.086511 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.086446 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.086616 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.086540 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 10:03:54.087025 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.086825 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vr484"
Apr 21 10:03:54.087254 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.087229 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 10:03:54.087814 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.087793 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wz6ln\""
Apr 21 10:03:54.087896 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.087857 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 10:03:54.088902 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.088883 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 10:03:54.088993 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.088886 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.089060 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.088951 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.089491 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.089472 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7"
Apr 21 10:03:54.090261 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.090242 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 10:03:54.090392 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.090374 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 10:03:54.090450 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.090419 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ffrvb\""
Apr 21 10:03:54.090792 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.090762 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2hhlj\""
Apr 21 10:03:54.090792 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.090776 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 10:03:54.090974 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.090933 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 10:03:54.091147 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.091130 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j2lqn"
Apr 21 10:03:54.091257 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.091234 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 10:03:54.091401 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.091383 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.091789 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.091768 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 10:03:54.091966 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.091871 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-z99xn\""
Apr 21 10:03:54.092286 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.092152 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.092737 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.092601 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p7n2z"
Apr 21 10:03:54.093326 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.092819 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72"
Apr 21 10:03:54.093326 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.093073 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b"
Apr 21 10:03:54.094170 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.094081 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.094422 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.094403 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 10:03:54.094496 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.094475 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pgq9t\""
Apr 21 10:03:54.095494 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.095443 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.095592 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.095577 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:03:54.096067 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.096025 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 10:03:54.096373 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.096355 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w5c75\""
Apr 21 10:03:54.096984 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.096710 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 10:03:54.104573 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104550 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqjj\" (UniqueName: \"kubernetes.io/projected/1b8b33c9-4316-4863-843f-730d4490910b-kube-api-access-sbqjj\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72"
Apr 21 10:03:54.104668 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/98faf1b6-99bb-4f34-822a-b471ba610d7d-konnectivity-ca\") pod \"konnectivity-agent-vr484\" (UID: \"98faf1b6-99bb-4f34-822a-b471ba610d7d\") " pod="kube-system/konnectivity-agent-vr484"
Apr 21 10:03:54.104725 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104665 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-kubelet\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.104784 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104724 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-etc-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.104784 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104749 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-env-overrides\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.104784 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104772 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a304ded0-8dcb-4d07-b2f5-a18c53303a25-host-slash\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104796 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-system-cni-dir\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104844 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-multus-certs\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104868 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysctl-d\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104920 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovnkube-script-lib\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.104986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104945 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.104990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-conf-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105025 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-run\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105044 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-sys\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-log-socket\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-socket-dir-parent\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105095 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105151 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-run-netns\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105173 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwnt\" (UniqueName: \"kubernetes.io/projected/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-kube-api-access-xvwnt\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105190 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-cni-multus\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105212 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq4lh\" (UniqueName: \"kubernetes.io/projected/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-kube-api-access-hq4lh\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105245 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-host\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.105320 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-os-release\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105328 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6pm\" (UniqueName: \"kubernetes.io/projected/e2f35c99-2ff3-4631-9d48-aec797338383-kube-api-access-zd6pm\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-socket-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovn-node-metrics-cert\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105413 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwsmm\" (UniqueName: \"kubernetes.io/projected/a304ded0-8dcb-4d07-b2f5-a18c53303a25-kube-api-access-wwsmm\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105433 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-cni-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105490 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-netns\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105504 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-node-log\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105541 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-k8s-cni-cncf-io\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105590 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-device-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105631 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-kubernetes\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gmcb\" (UniqueName: \"kubernetes.io/projected/bd1a8429-25a8-49c9-9dda-7a286fbe2767-kube-api-access-5gmcb\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105686 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-var-lib-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105710 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2f35c99-2ff3-4631-9d48-aec797338383-cni-binary-copy\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.105778 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105733 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/277f3705-c93f-4057-bbda-a7f798e4406d-hosts-file\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2"
Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105758 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-os-release\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105793 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hbm4\" (UniqueName: \"kubernetes.io/projected/31f3696c-a468-44ca-9299-c2d8a53166c8-kube-api-access-8hbm4\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267"
Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105834 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2f35c99-2ff3-4631-9d48-aec797338383-multus-daemon-config\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2"
Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105867 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysctl-conf\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8"
Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-cni-bin\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105918 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21
10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105945 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a304ded0-8dcb-4d07-b2f5-a18c53303a25-iptables-alerter-script\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105979 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-cnibin\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.105993 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xmm\" (UniqueName: \"kubernetes.io/projected/a9832641-6b46-4621-997a-be038aea43cb-kube-api-access-l5xmm\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106006 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-host\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106025 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-lib-modules\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106040 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd1a8429-25a8-49c9-9dda-7a286fbe2767-tmp\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106075 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-run-ovn-kubernetes\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106122 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-cni-bin\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106177 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-serviceca\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.106354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106209 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndd28\" (UniqueName: \"kubernetes.io/projected/277f3705-c93f-4057-bbda-a7f798e4406d-kube-api-access-ndd28\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovnkube-config\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-kubelet\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106276 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/98faf1b6-99bb-4f34-822a-b471ba610d7d-agent-certs\") pod \"konnectivity-agent-vr484\" (UID: \"98faf1b6-99bb-4f34-822a-b471ba610d7d\") " pod="kube-system/konnectivity-agent-vr484" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106300 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-modprobe-d\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 
10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-tuned\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/277f3705-c93f-4057-bbda-a7f798e4406d-tmp-dir\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106422 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-cnibin\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-etc-kubernetes\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106483 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106508 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-registration-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106540 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysconfig\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-sys-fs\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106646 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-systemd\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.106990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106695 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-systemd-units\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.107596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106717 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-systemd\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.107596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-ovn\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.107596 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:03:54.106761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-system-cni-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.107596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-hostroot\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.107596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106812 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-var-lib-kubelet\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.107596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106833 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-slash\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.107596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.106855 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-cni-netd\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.147124 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.147086 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:53 +0000 UTC" deadline="2027-10-22 20:43:33.908822402 +0000 UTC" Apr 21 10:03:54.147228 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.147129 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13186h39m39.761696906s" Apr 21 10:03:54.192930 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.192908 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 10:03:54.207305 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-cnibin\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.207395 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207309 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-etc-kubernetes\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.207395 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.207395 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207352 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-registration-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207401 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-cnibin\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-etc-kubernetes\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207417 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-registration-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:03:54.207430 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysconfig\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207523 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-sys-fs\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysconfig\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207541 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " 
pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:54.207564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207567 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-systemd\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207574 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-sys-fs\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-systemd-units\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207609 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-systemd\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-ovn\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207640 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-systemd\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-systemd-units\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-system-cni-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207692 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-systemd\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-system-cni-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.208007 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207702 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-hostroot\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-ovn\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-var-lib-kubelet\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207752 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-slash\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207759 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-hostroot\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207788 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-cni-netd\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207791 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-slash\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqjj\" (UniqueName: \"kubernetes.io/projected/1b8b33c9-4316-4863-843f-730d4490910b-kube-api-access-sbqjj\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:54.208007 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-var-lib-kubelet\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207831 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-cni-netd\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207840 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/98faf1b6-99bb-4f34-822a-b471ba610d7d-konnectivity-ca\") pod \"konnectivity-agent-vr484\" (UID: \"98faf1b6-99bb-4f34-822a-b471ba610d7d\") " pod="kube-system/konnectivity-agent-vr484" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-kubelet\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-etc-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-env-overrides\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207948 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-kubelet\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.207980 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a304ded0-8dcb-4d07-b2f5-a18c53303a25-host-slash\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208012 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-system-cni-dir\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208009 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-etc-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a304ded0-8dcb-4d07-b2f5-a18c53303a25-host-slash\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-system-cni-dir\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.208813 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:03:54.208086 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208126 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-multus-certs\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysctl-d\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-multus-certs\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 
10:03:54.208813 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovnkube-script-lib\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208240 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-conf-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-run-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208313 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-run\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208342 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-sys\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysctl-d\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208356 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-log-socket\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208355 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " 
pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208371 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-socket-dir-parent\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.208385 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208403 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/98faf1b6-99bb-4f34-822a-b471ba610d7d-konnectivity-ca\") pod \"konnectivity-agent-vr484\" (UID: \"98faf1b6-99bb-4f34-822a-b471ba610d7d\") " pod="kube-system/konnectivity-agent-vr484" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208411 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-run-netns\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208392 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-run\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208427 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwnt\" (UniqueName: \"kubernetes.io/projected/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-kube-api-access-xvwnt\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-conf-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.209533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208465 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-env-overrides\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.208488 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.708433726 +0000 UTC m=+3.052883593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208470 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-socket-dir-parent\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208505 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-sys\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208506 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-log-socket\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-run-netns\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-cni-multus\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208587 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-cni-multus\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq4lh\" (UniqueName: \"kubernetes.io/projected/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-kube-api-access-hq4lh\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208620 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-host\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-os-release\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208667 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6pm\" (UniqueName: \"kubernetes.io/projected/e2f35c99-2ff3-4631-9d48-aec797338383-kube-api-access-zd6pm\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-os-release\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-host\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-socket-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208834 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovn-node-metrics-cert\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.210238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208836 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovnkube-script-lib\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwsmm\" (UniqueName: \"kubernetes.io/projected/a304ded0-8dcb-4d07-b2f5-a18c53303a25-kube-api-access-wwsmm\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208912 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-socket-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208940 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.208966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-cni-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-multus-cni-dir\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-netns\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209076 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-node-log\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209085 2567 swap_util.go:74] "error creating dir 
to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209128 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-k8s-cni-cncf-io\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-node-log\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-device-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209155 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-netns\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209181 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-kubernetes\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209195 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-run-k8s-cni-cncf-io\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209209 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gmcb\" (UniqueName: \"kubernetes.io/projected/bd1a8429-25a8-49c9-9dda-7a286fbe2767-kube-api-access-5gmcb\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a9832641-6b46-4621-997a-be038aea43cb-device-dir\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.211035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209234 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-var-lib-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-kubernetes\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209285 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-var-lib-openvswitch\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2f35c99-2ff3-4631-9d48-aec797338383-cni-binary-copy\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209363 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/277f3705-c93f-4057-bbda-a7f798e4406d-hosts-file\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209404 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-os-release\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hbm4\" (UniqueName: \"kubernetes.io/projected/31f3696c-a468-44ca-9299-c2d8a53166c8-kube-api-access-8hbm4\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/277f3705-c93f-4057-bbda-a7f798e4406d-hosts-file\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2f35c99-2ff3-4631-9d48-aec797338383-multus-daemon-config\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysctl-conf\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:03:54.209492 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-os-release\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209422 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31f3696c-a468-44ca-9299-c2d8a53166c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-cni-bin\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-cni-bin\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209548 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a304ded0-8dcb-4d07-b2f5-a18c53303a25-iptables-alerter-script\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.211879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-cnibin\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209600 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209625 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xmm\" (UniqueName: \"kubernetes.io/projected/a9832641-6b46-4621-997a-be038aea43cb-kube-api-access-l5xmm\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209648 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-sysctl-conf\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209657 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31f3696c-a468-44ca-9299-c2d8a53166c8-cnibin\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209667 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-host\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-host\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-lib-modules\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209802 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd1a8429-25a8-49c9-9dda-7a286fbe2767-tmp\") 
pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-run-ovn-kubernetes\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209870 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-cni-bin\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2f35c99-2ff3-4631-9d48-aec797338383-cni-binary-copy\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209896 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-serviceca\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndd28\" (UniqueName: \"kubernetes.io/projected/277f3705-c93f-4057-bbda-a7f798e4406d-kube-api-access-ndd28\") pod 
\"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-cni-bin\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-lib-modules\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovnkube-config\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-host-run-ovn-kubernetes\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.212602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.209997 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-kubelet\") pod 
\"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/98faf1b6-99bb-4f34-822a-b471ba610d7d-agent-certs\") pod \"konnectivity-agent-vr484\" (UID: \"98faf1b6-99bb-4f34-822a-b471ba610d7d\") " pod="kube-system/konnectivity-agent-vr484" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-modprobe-d\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210074 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-tuned\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210101 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/277f3705-c93f-4057-bbda-a7f798e4406d-tmp-dir\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210335 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2f35c99-2ff3-4631-9d48-aec797338383-host-var-lib-kubelet\") pod \"multus-kchl2\" (UID: 
\"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210338 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-modprobe-d\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210386 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-serviceca\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovnkube-config\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210447 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/277f3705-c93f-4057-bbda-a7f798e4406d-tmp-dir\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210705 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a304ded0-8dcb-4d07-b2f5-a18c53303a25-iptables-alerter-script\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " 
pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.210890 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2f35c99-2ff3-4631-9d48-aec797338383-multus-daemon-config\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.212321 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd1a8429-25a8-49c9-9dda-7a286fbe2767-tmp\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.212371 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-ovn-node-metrics-cert\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.213175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.212493 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd1a8429-25a8-49c9-9dda-7a286fbe2767-etc-tuned\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.213639 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.213257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/98faf1b6-99bb-4f34-822a-b471ba610d7d-agent-certs\") pod \"konnectivity-agent-vr484\" (UID: \"98faf1b6-99bb-4f34-822a-b471ba610d7d\") " pod="kube-system/konnectivity-agent-vr484" Apr 21 
10:03:54.216571 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.216541 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwnt\" (UniqueName: \"kubernetes.io/projected/0d15ebaa-acf8-4b85-8a72-fdb57a04e985-kube-api-access-xvwnt\") pod \"ovnkube-node-l2qr9\" (UID: \"0d15ebaa-acf8-4b85-8a72-fdb57a04e985\") " pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.217974 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.217066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqjj\" (UniqueName: \"kubernetes.io/projected/1b8b33c9-4316-4863-843f-730d4490910b-kube-api-access-sbqjj\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:54.217974 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.217501 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:54.217974 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.217532 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:54.217974 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.217546 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.217974 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.217611 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e 
nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.717593944 +0000 UTC m=+3.062043808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.221178 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.220846 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5xmm\" (UniqueName: \"kubernetes.io/projected/a9832641-6b46-4621-997a-be038aea43cb-kube-api-access-l5xmm\") pod \"aws-ebs-csi-driver-node-rh9h7\" (UID: \"a9832641-6b46-4621-997a-be038aea43cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.222642 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.222596 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq4lh\" (UniqueName: \"kubernetes.io/projected/3b4e60e7-4ce7-450a-83fc-fd1e09de64f9-kube-api-access-hq4lh\") pod \"node-ca-j2lqn\" (UID: \"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9\") " pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.223197 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.223008 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hbm4\" (UniqueName: \"kubernetes.io/projected/31f3696c-a468-44ca-9299-c2d8a53166c8-kube-api-access-8hbm4\") pod \"multus-additional-cni-plugins-65267\" (UID: \"31f3696c-a468-44ca-9299-c2d8a53166c8\") " pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.223197 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.223157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gmcb\" (UniqueName: 
\"kubernetes.io/projected/bd1a8429-25a8-49c9-9dda-7a286fbe2767-kube-api-access-5gmcb\") pod \"tuned-z82s8\" (UID: \"bd1a8429-25a8-49c9-9dda-7a286fbe2767\") " pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.223445 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.223420 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6pm\" (UniqueName: \"kubernetes.io/projected/e2f35c99-2ff3-4631-9d48-aec797338383-kube-api-access-zd6pm\") pod \"multus-kchl2\" (UID: \"e2f35c99-2ff3-4631-9d48-aec797338383\") " pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.223698 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.223680 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwsmm\" (UniqueName: \"kubernetes.io/projected/a304ded0-8dcb-4d07-b2f5-a18c53303a25-kube-api-access-wwsmm\") pod \"iptables-alerter-p7n2z\" (UID: \"a304ded0-8dcb-4d07-b2f5-a18c53303a25\") " pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.223956 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.223935 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndd28\" (UniqueName: \"kubernetes.io/projected/277f3705-c93f-4057-bbda-a7f798e4406d-kube-api-access-ndd28\") pod \"node-resolver-d9np2\" (UID: \"277f3705-c93f-4057-bbda-a7f798e4406d\") " pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.395792 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.395717 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z82s8" Apr 21 10:03:54.405322 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.405295 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d9np2" Apr 21 10:03:54.411942 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.411918 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-65267" Apr 21 10:03:54.417580 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.417553 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kchl2" Apr 21 10:03:54.424262 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.424244 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:03:54.430726 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.430707 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vr484" Apr 21 10:03:54.441268 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.441251 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" Apr 21 10:03:54.447860 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.447841 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j2lqn" Apr 21 10:03:54.453416 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.453398 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p7n2z" Apr 21 10:03:54.616831 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.616799 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kknsw"] Apr 21 10:03:54.618665 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.618641 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.618792 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.618725 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:03:54.713146 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.713055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-dbus\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.713146 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.713098 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-kubelet-config\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.713328 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.713200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.713328 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.713221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:54.713406 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.713344 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.713406 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.713404 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:55.713387897 +0000 UTC m=+4.057837762 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.814254 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.814220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.814397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.814282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-dbus\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 
10:03:54.814397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.814312 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-kubelet-config\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.814397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.814358 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:54.814397 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.814387 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.814470 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:03:55.314449015 +0000 UTC m=+3.658898901 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.814473 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.814493 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.814508 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.814512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-kubelet-config\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:54.814512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-dbus\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " 
pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:54.814573 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:54.814543 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e nodeName:}" failed. No retries permitted until 2026-04-21 10:03:55.814532001 +0000 UTC m=+4.158981869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.892684 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.892467 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4e60e7_4ce7_450a_83fc_fd1e09de64f9.slice/crio-b01a555f47302c0c432fc80988da346d71f39219635b789fe5cc46bd9f6e6ff0 WatchSource:0}: Error finding container b01a555f47302c0c432fc80988da346d71f39219635b789fe5cc46bd9f6e6ff0: Status 404 returned error can't find the container with id b01a555f47302c0c432fc80988da346d71f39219635b789fe5cc46bd9f6e6ff0 Apr 21 10:03:54.892942 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.892907 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9832641_6b46_4621_997a_be038aea43cb.slice/crio-b8b9e03eca5ba01ca887ace87f15bc565b309b4af749aea1f227c064e9efa637 WatchSource:0}: Error finding container b8b9e03eca5ba01ca887ace87f15bc565b309b4af749aea1f227c064e9efa637: Status 404 returned error can't find the container with id b8b9e03eca5ba01ca887ace87f15bc565b309b4af749aea1f227c064e9efa637 Apr 21 10:03:54.893786 
ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.893712 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98faf1b6_99bb_4f34_822a_b471ba610d7d.slice/crio-029b60490ea9f944bab9a8c9c72e688a4faa0effd1842709767fb2283bcb5077 WatchSource:0}: Error finding container 029b60490ea9f944bab9a8c9c72e688a4faa0effd1842709767fb2283bcb5077: Status 404 returned error can't find the container with id 029b60490ea9f944bab9a8c9c72e688a4faa0effd1842709767fb2283bcb5077 Apr 21 10:03:54.897160 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.897138 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f3696c_a468_44ca_9299_c2d8a53166c8.slice/crio-b1ad6811851f42d5e69521095128631178f50ecc8371d2b3b575b6f2d7179880 WatchSource:0}: Error finding container b1ad6811851f42d5e69521095128631178f50ecc8371d2b3b575b6f2d7179880: Status 404 returned error can't find the container with id b1ad6811851f42d5e69521095128631178f50ecc8371d2b3b575b6f2d7179880 Apr 21 10:03:54.897931 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.897891 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d15ebaa_acf8_4b85_8a72_fdb57a04e985.slice/crio-51347f294c47e5a50d09e764041022984d4b99ac0bf31a6e628e2592f87e3abb WatchSource:0}: Error finding container 51347f294c47e5a50d09e764041022984d4b99ac0bf31a6e628e2592f87e3abb: Status 404 returned error can't find the container with id 51347f294c47e5a50d09e764041022984d4b99ac0bf31a6e628e2592f87e3abb Apr 21 10:03:54.899330 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.899307 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f35c99_2ff3_4631_9d48_aec797338383.slice/crio-5b7ba6b949aafc1e2eee00ddf605d7e5aa50ad0f4dbdd9773ee2c8a97c0ef71c WatchSource:0}: Error 
finding container 5b7ba6b949aafc1e2eee00ddf605d7e5aa50ad0f4dbdd9773ee2c8a97c0ef71c: Status 404 returned error can't find the container with id 5b7ba6b949aafc1e2eee00ddf605d7e5aa50ad0f4dbdd9773ee2c8a97c0ef71c Apr 21 10:03:54.899780 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.899677 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod277f3705_c93f_4057_bbda_a7f798e4406d.slice/crio-7757799d499f85f89e3cafa0f0bdabfd1347b8943a5fb1a512c491260cb8b956 WatchSource:0}: Error finding container 7757799d499f85f89e3cafa0f0bdabfd1347b8943a5fb1a512c491260cb8b956: Status 404 returned error can't find the container with id 7757799d499f85f89e3cafa0f0bdabfd1347b8943a5fb1a512c491260cb8b956 Apr 21 10:03:54.901700 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.900904 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd1a8429_25a8_49c9_9dda_7a286fbe2767.slice/crio-12070baa012322be9b06add8110bdc1d22add1c7fdc2150fc829ba089c9c18d9 WatchSource:0}: Error finding container 12070baa012322be9b06add8110bdc1d22add1c7fdc2150fc829ba089c9c18d9: Status 404 returned error can't find the container with id 12070baa012322be9b06add8110bdc1d22add1c7fdc2150fc829ba089c9c18d9 Apr 21 10:03:54.902081 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:03:54.902052 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda304ded0_8dcb_4d07_b2f5_a18c53303a25.slice/crio-49fd75c2f130b4e0a43f828f920d1045031daccf87a0a97133a62dce3f4a79ed WatchSource:0}: Error finding container 49fd75c2f130b4e0a43f828f920d1045031daccf87a0a97133a62dce3f4a79ed: Status 404 returned error can't find the container with id 49fd75c2f130b4e0a43f828f920d1045031daccf87a0a97133a62dce3f4a79ed Apr 21 10:03:55.147403 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.147372 2567 certificate_manager.go:715] 
"Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:53 +0000 UTC" deadline="2027-12-10 21:16:52.124323629 +0000 UTC" Apr 21 10:03:55.147403 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.147398 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14363h12m56.976927427s" Apr 21 10:03:55.252775 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.252217 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:55.252775 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.252358 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:03:55.262680 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.262618 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" event={"ID":"366ead18e7fb69eb4a529be7bbd9f14e","Type":"ContainerStarted","Data":"63b2ec16bbea4cee324107dd2f810ba784bacf33d1bc6dac89e50882f6cf0f59"} Apr 21 10:03:55.271512 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.271481 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z82s8" event={"ID":"bd1a8429-25a8-49c9-9dda-7a286fbe2767","Type":"ContainerStarted","Data":"12070baa012322be9b06add8110bdc1d22add1c7fdc2150fc829ba089c9c18d9"} Apr 21 10:03:55.282482 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.280217 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-84.ec2.internal" podStartSLOduration=2.280204895 podStartE2EDuration="2.280204895s" podCreationTimestamp="2026-04-21 10:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:55.280018903 +0000 UTC m=+3.624468788" watchObservedRunningTime="2026-04-21 10:03:55.280204895 +0000 UTC m=+3.624654780" Apr 21 10:03:55.284664 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.284636 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d9np2" event={"ID":"277f3705-c93f-4057-bbda-a7f798e4406d","Type":"ContainerStarted","Data":"7757799d499f85f89e3cafa0f0bdabfd1347b8943a5fb1a512c491260cb8b956"} Apr 21 10:03:55.288828 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.288770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"51347f294c47e5a50d09e764041022984d4b99ac0bf31a6e628e2592f87e3abb"} Apr 21 10:03:55.297073 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.297022 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vr484" event={"ID":"98faf1b6-99bb-4f34-822a-b471ba610d7d","Type":"ContainerStarted","Data":"029b60490ea9f944bab9a8c9c72e688a4faa0effd1842709767fb2283bcb5077"} Apr 21 10:03:55.308313 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.308245 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j2lqn" event={"ID":"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9","Type":"ContainerStarted","Data":"b01a555f47302c0c432fc80988da346d71f39219635b789fe5cc46bd9f6e6ff0"} Apr 21 10:03:55.309578 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.309520 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p7n2z" 
event={"ID":"a304ded0-8dcb-4d07-b2f5-a18c53303a25","Type":"ContainerStarted","Data":"49fd75c2f130b4e0a43f828f920d1045031daccf87a0a97133a62dce3f4a79ed"} Apr 21 10:03:55.311399 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.311351 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kchl2" event={"ID":"e2f35c99-2ff3-4631-9d48-aec797338383","Type":"ContainerStarted","Data":"5b7ba6b949aafc1e2eee00ddf605d7e5aa50ad0f4dbdd9773ee2c8a97c0ef71c"} Apr 21 10:03:55.314272 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.314237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerStarted","Data":"b1ad6811851f42d5e69521095128631178f50ecc8371d2b3b575b6f2d7179880"} Apr 21 10:03:55.321463 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.321437 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:55.322004 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.321582 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:55.322004 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.321677 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:03:56.321658678 +0000 UTC m=+4.666108548 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:55.327375 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.327334 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" event={"ID":"a9832641-6b46-4621-997a-be038aea43cb","Type":"ContainerStarted","Data":"b8b9e03eca5ba01ca887ace87f15bc565b309b4af749aea1f227c064e9efa637"} Apr 21 10:03:55.724039 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.724008 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:55.724225 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.724209 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:55.724295 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.724269 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:03:57.724249831 +0000 UTC m=+6.068699697 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:55.825242 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:55.825210 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:55.825384 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.825365 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:55.825451 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.825393 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:55.825451 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.825407 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:55.825544 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:55.825472 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e nodeName:}" failed. 
No retries permitted until 2026-04-21 10:03:57.825452996 +0000 UTC m=+6.169902893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:56.255476 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:56.255443 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:56.255964 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:56.255584 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:03:56.256024 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:56.256011 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:56.256173 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:56.256099 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:03:56.329855 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:56.329315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:56.329855 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:56.329470 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:56.329855 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:56.329540 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:03:58.329518803 +0000 UTC m=+6.673968667 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:56.344371 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:56.344340 2567 generic.go:358] "Generic (PLEG): container finished" podID="e84065e070e871c05028ff6c3f13e37a" containerID="2d8ffcfd8a5c326f92d0450b916786d9bcfa87d2c86e592b6cba5f857c6f2941" exitCode=0 Apr 21 10:03:56.345133 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:56.344858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" event={"ID":"e84065e070e871c05028ff6c3f13e37a","Type":"ContainerDied","Data":"2d8ffcfd8a5c326f92d0450b916786d9bcfa87d2c86e592b6cba5f857c6f2941"} Apr 21 10:03:57.252401 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:57.252346 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:57.252578 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.252476 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:03:57.364466 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:57.364424 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" event={"ID":"e84065e070e871c05028ff6c3f13e37a","Type":"ContainerStarted","Data":"84beca1b54f5ea0874e3887f2659722c3ea9b5d523639fdc30f547e629a43e47"} Apr 21 10:03:57.742280 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:57.742201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:57.742435 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.742353 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:57.742435 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.742416 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:01.742398287 +0000 UTC m=+10.086848165 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:57.842843 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:57.842807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:57.843006 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.842972 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:57.843006 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.842992 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:57.843006 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.843004 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:57.843189 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:57.843065 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:01.843044696 +0000 UTC m=+10.187494561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:58.252952 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:58.252921 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:58.253138 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:58.252931 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:03:58.253138 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:58.253050 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:03:58.256314 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:58.255391 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:03:58.348082 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:58.347557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:03:58.348082 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:58.347714 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:58.348082 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:58.347772 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:02.347754695 +0000 UTC m=+10.692204561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:03:59.252865 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:03:59.252608 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:03:59.252865 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:03:59.252739 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:00.253512 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:00.253021 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:00.253512 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:00.253175 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:00.253512 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:00.253494 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:00.254053 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:00.253604 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:01.252933 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:01.252854 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:01.253149 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.252988 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:01.776779 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:01.776727 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:01.777205 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.776885 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:01.777205 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.776963 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:09.776941118 +0000 UTC m=+18.121390984 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:01.878098 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:01.877998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:01.878271 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.878199 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:01.878271 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.878227 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:01.878271 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.878240 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:01.878432 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:01.878298 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:09.878280208 +0000 UTC m=+18.222730076 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:02.253603 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:02.253564 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:02.253766 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:02.253695 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:02.254009 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:02.253984 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:02.254139 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:02.254102 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:02.383563 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:02.383481 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:02.383726 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:02.383624 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:02.383726 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:02.383682 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:10.383663838 +0000 UTC m=+18.728113704 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:03.252315 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:03.252275 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:03.252754 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:03.252405 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:04.252558 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:04.252481 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:04.252945 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:04.252497 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:04.252945 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:04.252627 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:04.252945 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:04.252717 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:05.252805 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:05.252772 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:05.253215 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:05.252898 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:06.252888 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:06.252847 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:06.253351 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:06.252960 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:06.253351 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:06.253049 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:06.253351 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:06.253178 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:07.252816 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:07.252781 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:07.253007 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:07.252898 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:08.252939 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:08.252906 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:08.253142 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:08.253024 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:08.253142 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:08.253062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:08.253548 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:08.253162 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:09.252900 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:09.252865 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:09.253070 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.252992 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:09.840737 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:09.840706 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:09.841198 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.840843 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:09.841198 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.840905 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.840891505 +0000 UTC m=+34.185341372 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:09.941412 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:09.941382 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:09.941575 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.941561 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:09.941652 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.941586 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:09.941652 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.941599 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:09.941750 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:09.941661 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:25.941641985 +0000 UTC m=+34.286091860 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:10.252556 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:10.252478 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:10.252764 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:10.252474 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:10.252764 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:10.252626 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:10.252764 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:10.252697 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:10.446125 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:10.446068 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:10.446302 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:10.446274 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:10.446365 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:10.446353 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:26.446332475 +0000 UTC m=+34.790782351 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:11.252323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:11.252293 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:11.252720 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:11.252406 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:12.253485 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.253218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:12.253977 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:12.253544 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:12.253977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.253302 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:12.253977 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:12.253619 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:12.395280 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.395194 2567 generic.go:358] "Generic (PLEG): container finished" podID="31f3696c-a468-44ca-9299-c2d8a53166c8" containerID="e829bb9925988deddf4cf27f7f8a336cc564b39875482e6192fae0932878c11a" exitCode=0 Apr 21 10:04:12.395471 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.395283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerDied","Data":"e829bb9925988deddf4cf27f7f8a336cc564b39875482e6192fae0932878c11a"} Apr 21 10:04:12.396805 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.396776 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" event={"ID":"a9832641-6b46-4621-997a-be038aea43cb","Type":"ContainerStarted","Data":"49334ef39d68073f99b321354bc0fbfa434d3deabcccaacafccd54338ca05fc5"} Apr 21 10:04:12.398185 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.398158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z82s8" event={"ID":"bd1a8429-25a8-49c9-9dda-7a286fbe2767","Type":"ContainerStarted","Data":"36089dcb69073785e8dce366dbf055156d1672edaf91509efe776f368d80b8b0"} Apr 21 10:04:12.399609 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.399577 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d9np2" event={"ID":"277f3705-c93f-4057-bbda-a7f798e4406d","Type":"ContainerStarted","Data":"9472967b0f50e8a471d530851755909cfdf49de280757d5d92660cb6b1cae616"} Apr 21 10:04:12.402246 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402230 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 
10:04:12.402533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402515 2567 generic.go:358] "Generic (PLEG): container finished" podID="0d15ebaa-acf8-4b85-8a72-fdb57a04e985" containerID="182d68c25e7e058f5e6cf5d3f5b0739674d86a21a21e2a96d0a46320154ef4c1" exitCode=1 Apr 21 10:04:12.402588 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402579 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"4810f7550876b65a0ced5850a979d9dd2ab441d75719d46c14044710beb93c44"} Apr 21 10:04:12.402637 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402599 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"aa1494b67e8cc2ffe5100ae8ebc34d12e19cc0205e7a968738c5c28cdae7858a"} Apr 21 10:04:12.402637 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402610 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"3319617b19c74bb607045f8fc36cd9e7cb00e0e34ff637cf9e4d46e6908d2f37"} Apr 21 10:04:12.402637 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402618 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"0211cea3886995e74d710a4a116a2c79aab59fc732a5b3457384d0f9bcb90471"} Apr 21 10:04:12.402637 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.402626 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerDied","Data":"182d68c25e7e058f5e6cf5d3f5b0739674d86a21a21e2a96d0a46320154ef4c1"} Apr 21 10:04:12.402784 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:04:12.402638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"260986ce6163d3209f94c6cb8cd608a70dfd33fe4ead4a16cf9e37fe2f94e827"} Apr 21 10:04:12.403658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.403639 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vr484" event={"ID":"98faf1b6-99bb-4f34-822a-b471ba610d7d","Type":"ContainerStarted","Data":"ff46533bf1b7dfe3f47dfcd9918df1dec89e3f9046589bfb36615b7e208e0c03"} Apr 21 10:04:12.404788 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.404769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j2lqn" event={"ID":"3b4e60e7-4ce7-450a-83fc-fd1e09de64f9","Type":"ContainerStarted","Data":"4e5dc0a1c47ecffbcf2298875ac6e577fba74887c9e02621fb16e8830891c435"} Apr 21 10:04:12.405945 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.405928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kchl2" event={"ID":"e2f35c99-2ff3-4631-9d48-aec797338383","Type":"ContainerStarted","Data":"c213aed30bb2eca8b45d779fc4fd5cb069ba865cfda1453849bd942140f78e26"} Apr 21 10:04:12.423203 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.423168 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-84.ec2.internal" podStartSLOduration=19.423156543 podStartE2EDuration="19.423156543s" podCreationTimestamp="2026-04-21 10:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:57.381091268 +0000 UTC m=+5.725541155" watchObservedRunningTime="2026-04-21 10:04:12.423156543 +0000 UTC m=+20.767606428" Apr 21 10:04:12.438901 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.438868 
2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d9np2" podStartSLOduration=3.817563442 podStartE2EDuration="20.438856179s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.902382806 +0000 UTC m=+3.246832684" lastFinishedPulling="2026-04-21 10:04:11.523675557 +0000 UTC m=+19.868125421" observedRunningTime="2026-04-21 10:04:12.438389985 +0000 UTC m=+20.782839871" watchObservedRunningTime="2026-04-21 10:04:12.438856179 +0000 UTC m=+20.783306063" Apr 21 10:04:12.458733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.458699 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kchl2" podStartSLOduration=3.790087345 podStartE2EDuration="20.458689033s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.901559685 +0000 UTC m=+3.246009562" lastFinishedPulling="2026-04-21 10:04:11.570161387 +0000 UTC m=+19.914611250" observedRunningTime="2026-04-21 10:04:12.458617366 +0000 UTC m=+20.803067250" watchObservedRunningTime="2026-04-21 10:04:12.458689033 +0000 UTC m=+20.803138918" Apr 21 10:04:12.475141 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.475091 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vr484" podStartSLOduration=3.84787999 podStartE2EDuration="20.475081095s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.896680023 +0000 UTC m=+3.241129891" lastFinishedPulling="2026-04-21 10:04:11.523881112 +0000 UTC m=+19.868330996" observedRunningTime="2026-04-21 10:04:12.474990891 +0000 UTC m=+20.819440775" watchObservedRunningTime="2026-04-21 10:04:12.475081095 +0000 UTC m=+20.819530984" Apr 21 10:04:12.495532 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.495500 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-j2lqn" podStartSLOduration=11.627823943 podStartE2EDuration="20.495490959s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.894288257 +0000 UTC m=+3.238738125" lastFinishedPulling="2026-04-21 10:04:03.761955275 +0000 UTC m=+12.106405141" observedRunningTime="2026-04-21 10:04:12.49521683 +0000 UTC m=+20.839666715" watchObservedRunningTime="2026-04-21 10:04:12.495490959 +0000 UTC m=+20.839940843" Apr 21 10:04:12.518909 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:12.518857 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z82s8" podStartSLOduration=3.8528313880000002 podStartE2EDuration="20.51884092s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.903270818 +0000 UTC m=+3.247720694" lastFinishedPulling="2026-04-21 10:04:11.569280345 +0000 UTC m=+19.913730226" observedRunningTime="2026-04-21 10:04:12.518689564 +0000 UTC m=+20.863139440" watchObservedRunningTime="2026-04-21 10:04:12.51884092 +0000 UTC m=+20.863290808" Apr 21 10:04:13.252886 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.252752 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:13.253010 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:13.252984 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:13.309609 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.309580 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 10:04:13.409429 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.409393 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p7n2z" event={"ID":"a304ded0-8dcb-4d07-b2f5-a18c53303a25","Type":"ContainerStarted","Data":"0d5ab0b0bd7d2666b8752ab79c124c311dec4a8a520093ce7e7da46f65ad8e5f"} Apr 21 10:04:13.411150 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.411122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" event={"ID":"a9832641-6b46-4621-997a-be038aea43cb","Type":"ContainerStarted","Data":"c4975e9b34f10f2c52d76f8ba833d7c6a8bfca6b365c4daaf700e6fb9fecf064"} Apr 21 10:04:13.766722 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.766691 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vr484" Apr 21 10:04:13.767268 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.767054 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vr484" Apr 21 10:04:13.782733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:13.782682 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p7n2z" podStartSLOduration=5.163034335 podStartE2EDuration="21.78266441s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.904188379 +0000 UTC m=+3.248638260" lastFinishedPulling="2026-04-21 10:04:11.523818467 +0000 UTC m=+19.868268335" observedRunningTime="2026-04-21 10:04:13.426130541 +0000 UTC 
m=+21.770580424" watchObservedRunningTime="2026-04-21 10:04:13.78266441 +0000 UTC m=+22.127114288" Apr 21 10:04:14.175298 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.175158 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:13.30960211Z","UUID":"6c811d41-cc54-494a-a71d-d62b8b82053d","Handler":null,"Name":"","Endpoint":""} Apr 21 10:04:14.177379 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.177354 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 10:04:14.177379 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.177380 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 10:04:14.252564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.252532 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:14.252725 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.252575 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:14.252725 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:14.252676 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:14.252843 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:14.252790 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:14.413040 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.412927 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vr484" Apr 21 10:04:14.413637 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:14.413620 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vr484" Apr 21 10:04:15.252351 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:15.252317 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:15.252515 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:15.252447 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:15.416560 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:15.416466 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" event={"ID":"a9832641-6b46-4621-997a-be038aea43cb","Type":"ContainerStarted","Data":"524d07e852e4465d5f96373225823c9ba45973270425006b795f66cceff3edfd"} Apr 21 10:04:15.419522 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:15.419496 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:04:15.419863 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:15.419836 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"523a091b4f089a8d4d9dad5ce7c052ecff58fa5128f794fcf9404bd5605016b4"} Apr 21 10:04:15.434295 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:15.434248 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rh9h7" podStartSLOduration=3.878726373 podStartE2EDuration="23.434232664s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.894894267 +0000 UTC m=+3.239344142" lastFinishedPulling="2026-04-21 10:04:14.450400558 +0000 UTC m=+22.794850433" observedRunningTime="2026-04-21 10:04:15.433358523 +0000 UTC m=+23.777808419" watchObservedRunningTime="2026-04-21 10:04:15.434232664 +0000 UTC m=+23.778682552" Apr 21 10:04:16.252659 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:16.252628 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:16.252659 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:16.252655 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:16.252862 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:16.252753 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:16.252943 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:16.252914 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:17.252954 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.252764 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:17.253392 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:17.252986 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:17.425741 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.425712 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:04:17.426060 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.426034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"aaa7686d36ce34dc2e189cdf678f3d0abfe0f3c639639576f7af627d99fd75dd"} Apr 21 10:04:17.426363 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.426341 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:04:17.426363 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.426366 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:04:17.426544 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.426517 2567 scope.go:117] "RemoveContainer" containerID="182d68c25e7e058f5e6cf5d3f5b0739674d86a21a21e2a96d0a46320154ef4c1" Apr 21 10:04:17.427736 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.427714 2567 generic.go:358] "Generic (PLEG): container finished" podID="31f3696c-a468-44ca-9299-c2d8a53166c8" containerID="bae2a1d2fe9f431f9064cbd0e01e3712404605601e64ba864fb4ff33ff1d86ed" exitCode=0 Apr 21 10:04:17.427832 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:17.427744 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerDied","Data":"bae2a1d2fe9f431f9064cbd0e01e3712404605601e64ba864fb4ff33ff1d86ed"} Apr 21 10:04:17.443038 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:04:17.443020 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:04:18.252479 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.252452 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:18.252648 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.252501 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:18.252648 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:18.252575 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:18.252764 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:18.252674 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:18.432162 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.432140 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:04:18.432542 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.432476 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" event={"ID":"0d15ebaa-acf8-4b85-8a72-fdb57a04e985","Type":"ContainerStarted","Data":"1f52a130c35030af09b2ffef8e47f7540c508e0d25806d3775343b66ba9afb19"} Apr 21 10:04:18.432785 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.432757 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:04:18.447069 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.447049 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" Apr 21 10:04:18.500795 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.500741 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9" podStartSLOduration=9.757616657 podStartE2EDuration="26.50072314s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.900221992 +0000 UTC m=+3.244671866" lastFinishedPulling="2026-04-21 10:04:11.643328475 +0000 UTC m=+19.987778349" observedRunningTime="2026-04-21 10:04:18.468385405 +0000 UTC m=+26.812835290" watchObservedRunningTime="2026-04-21 10:04:18.50072314 +0000 UTC m=+26.845173025" Apr 21 10:04:18.796035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.795848 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vrs72"] Apr 21 10:04:18.796212 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:04:18.796081 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:18.796274 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:18.796213 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:18.798894 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.798863 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lf67k"] Apr 21 10:04:18.799031 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.798979 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:18.799092 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:18.799055 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:18.799500 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.799479 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kknsw"] Apr 21 10:04:18.799575 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:18.799562 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:18.799677 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:18.799656 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:19.439277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:19.439244 2567 generic.go:358] "Generic (PLEG): container finished" podID="31f3696c-a468-44ca-9299-c2d8a53166c8" containerID="05e3bca715e84f277c4826a019533c4615881588f367093e8d15b0436573d768" exitCode=0 Apr 21 10:04:19.439645 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:19.439319 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerDied","Data":"05e3bca715e84f277c4826a019533c4615881588f367093e8d15b0436573d768"} Apr 21 10:04:20.253195 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:20.253167 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:20.253342 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:20.253168 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:20.253383 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:20.253257 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:20.253415 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:20.253389 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:20.661925 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:20.661902 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d9np2_277f3705-c93f-4057-bbda-a7f798e4406d/dns-node-resolver/0.log" Apr 21 10:04:21.253081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:21.253055 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:21.253221 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:21.253181 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:21.444445 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:21.444414 2567 generic.go:358] "Generic (PLEG): container finished" podID="31f3696c-a468-44ca-9299-c2d8a53166c8" containerID="4dc2d6c389b77092eee17d41c2ee33bb9081fdac4ac62784c04f3db3de5eb3a8" exitCode=0 Apr 21 10:04:21.444548 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:21.444491 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerDied","Data":"4dc2d6c389b77092eee17d41c2ee33bb9081fdac4ac62784c04f3db3de5eb3a8"} Apr 21 10:04:22.045873 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:22.045848 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j2lqn_3b4e60e7-4ce7-450a-83fc-fd1e09de64f9/node-ca/0.log" Apr 21 10:04:22.253738 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:22.253697 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:22.253903 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:22.253807 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:22.253903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:22.253863 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:22.254012 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:22.253947 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:23.252397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:23.252323 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:23.252846 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:23.252453 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:24.252334 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:24.252303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:24.252597 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:24.252303 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:24.252597 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:24.252436 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:24.252597 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:24.252511 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:25.252178 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:25.252150 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:25.252363 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.252267 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:25.865914 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:25.865868 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:25.866363 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.866041 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:25.866363 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.866133 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs podName:1b8b33c9-4316-4863-843f-730d4490910b nodeName:}" failed. No retries permitted until 2026-04-21 10:04:57.86609815 +0000 UTC m=+66.210548024 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs") pod "network-metrics-daemon-vrs72" (UID: "1b8b33c9-4316-4863-843f-730d4490910b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:04:25.966710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:25.966679 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:25.966886 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.966813 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:04:25.966886 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.966830 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:04:25.966886 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.966840 2567 projected.go:194] Error preparing data for projected volume kube-api-access-6j8f4 for pod openshift-network-diagnostics/network-check-target-lf67k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:25.967050 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:25.966890 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4 podName:8cb513a2-6bf9-465c-bac3-8b87096c0e4e nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:57.966875829 +0000 UTC m=+66.311325692 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6j8f4" (UniqueName: "kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4") pod "network-check-target-lf67k" (UID: "8cb513a2-6bf9-465c-bac3-8b87096c0e4e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:04:26.252857 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:26.252777 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:26.253006 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:26.252777 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:26.253006 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:26.252917 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:26.253006 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:26.252993 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:26.470203 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:26.470167 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:26.470360 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:26.470302 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:26.470360 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:26.470359 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret podName:66ec4bf1-14e6-42c4-9174-6a6f20406a1c nodeName:}" failed. No retries permitted until 2026-04-21 10:04:58.470341164 +0000 UTC m=+66.814791028 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret") pod "global-pull-secret-syncer-kknsw" (UID: "66ec4bf1-14e6-42c4-9174-6a6f20406a1c") : object "kube-system"/"original-pull-secret" not registered Apr 21 10:04:27.252152 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:27.252124 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:27.252532 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:27.252241 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:27.458721 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:27.458485 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerStarted","Data":"970c058ca6098126065d09977e29508b6e0e7cf1cf6c2a28f519fcf9ac712cad"} Apr 21 10:04:28.252499 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:28.252466 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:28.252889 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:28.252472 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:28.252889 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:28.252565 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:28.252889 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:28.252681 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:28.462976 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:28.462945 2567 generic.go:358] "Generic (PLEG): container finished" podID="31f3696c-a468-44ca-9299-c2d8a53166c8" containerID="970c058ca6098126065d09977e29508b6e0e7cf1cf6c2a28f519fcf9ac712cad" exitCode=0 Apr 21 10:04:28.463133 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:28.463001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerDied","Data":"970c058ca6098126065d09977e29508b6e0e7cf1cf6c2a28f519fcf9ac712cad"} Apr 21 10:04:29.253149 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:29.253101 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:29.253557 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:29.253215 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:29.467501 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:29.467473 2567 generic.go:358] "Generic (PLEG): container finished" podID="31f3696c-a468-44ca-9299-c2d8a53166c8" containerID="c5a2cffcbd18cb90889045e62a344f40543c99ec02d0458f61ef711213f17769" exitCode=0 Apr 21 10:04:29.467653 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:29.467520 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerDied","Data":"c5a2cffcbd18cb90889045e62a344f40543c99ec02d0458f61ef711213f17769"} Apr 21 10:04:30.253079 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:30.253047 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:30.253258 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:30.253047 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:30.253258 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:30.253191 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:30.253258 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:30.253209 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:30.471564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:30.471532 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-65267" event={"ID":"31f3696c-a468-44ca-9299-c2d8a53166c8","Type":"ContainerStarted","Data":"7ebdae92f0868affc719a699a129c0e0c5f6062d06bf15144a1a3ff3d07ca889"} Apr 21 10:04:30.497648 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:30.497612 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-65267" podStartSLOduration=6.122557226 podStartE2EDuration="38.497600694s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.899313178 +0000 UTC m=+3.243763055" lastFinishedPulling="2026-04-21 10:04:27.274356652 +0000 UTC m=+35.618806523" observedRunningTime="2026-04-21 10:04:30.497266425 +0000 UTC m=+38.841716336" watchObservedRunningTime="2026-04-21 10:04:30.497600694 +0000 UTC m=+38.842050578" Apr 21 10:04:31.252290 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:31.252258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:31.252460 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:31.252354 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:32.253203 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:32.253170 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:32.253633 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:32.253242 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:32.253633 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:32.253317 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:32.253633 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:32.253398 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:33.252157 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:33.252128 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:33.252315 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:33.252215 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:34.252692 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:34.252665 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:34.253051 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:34.252667 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:34.253051 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:34.252758 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:34.253051 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:34.252834 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:35.252488 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:35.252458 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:35.252646 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:35.252548 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:36.252707 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:36.252682 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:36.253055 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:36.252785 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:36.253055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:36.252851 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:36.253055 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:36.252953 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:37.252617 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:37.252580 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:37.252804 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:37.252700 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:38.252653 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:38.252620 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:38.252813 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:38.252755 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:38.253060 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:38.252808 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:38.253060 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:38.252885 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:39.252676 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:39.252644 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:39.252853 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:39.252749 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:40.253187 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:40.253156 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:40.253602 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:40.253164 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:40.253602 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:40.253245 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:40.253602 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:40.253335 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:41.252628 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:41.252600 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:41.252783 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:41.252688 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:42.254882 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:42.254855 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:42.255256 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:42.254932 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lf67k" podUID="8cb513a2-6bf9-465c-bac3-8b87096c0e4e" Apr 21 10:04:42.255256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:42.254855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:42.255256 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:42.255000 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vrs72" podUID="1b8b33c9-4316-4863-843f-730d4490910b" Apr 21 10:04:43.252910 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.252879 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:43.253073 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:43.253016 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kknsw" podUID="66ec4bf1-14e6-42c4-9174-6a6f20406a1c" Apr 21 10:04:43.468381 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.468350 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-84.ec2.internal" event="NodeReady" Apr 21 10:04:43.468738 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.468466 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 10:04:43.518668 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.518608 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7776896ff5-2q4gp"] Apr 21 10:04:43.548908 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.548887 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x974r"] Apr 21 10:04:43.549034 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.549006 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.551753 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.551731 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 10:04:43.551882 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.551771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 10:04:43.551979 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.551963 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 10:04:43.552129 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.552087 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvkrb\"" Apr 21 10:04:43.558261 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:04:43.558245 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 10:04:43.574650 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.574630 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7776896ff5-2q4gp"] Apr 21 10:04:43.574752 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.574663 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lzplb"] Apr 21 10:04:43.574752 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.574684 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.577988 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.577969 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 10:04:43.578175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.577971 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 10:04:43.578175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.578057 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 10:04:43.578530 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.578514 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 10:04:43.578583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.578532 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-l8m2h\"" Apr 21 10:04:43.599681 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.599660 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-x974r"] Apr 21 10:04:43.599772 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.599684 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lzplb"] Apr 21 10:04:43.599831 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.599790 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lzplb" Apr 21 10:04:43.604153 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.604133 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 10:04:43.604286 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.604255 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 10:04:43.604375 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.604359 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nglr4\"" Apr 21 10:04:43.605626 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605609 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/095a4938-5652-428e-9f3f-d766898b0bab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.605694 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605637 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-bound-sa-token\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.605694 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:04:43.605656 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37210912-0c16-4531-be5a-e4e0262a5e52-ca-trust-extracted\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.605694 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605681 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/095a4938-5652-428e-9f3f-d766898b0bab-data-volume\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.605832 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605699 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37210912-0c16-4531-be5a-e4e0262a5e52-installation-pull-secrets\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.605832 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605767 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c8dd\" (UniqueName: \"kubernetes.io/projected/095a4938-5652-428e-9f3f-d766898b0bab-kube-api-access-2c8dd\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.605898 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605825 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-registry-tls\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.605898 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605860 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/095a4938-5652-428e-9f3f-d766898b0bab-crio-socket\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.605982 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37210912-0c16-4531-be5a-e4e0262a5e52-image-registry-private-configuration\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.605982 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605919 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37210912-0c16-4531-be5a-e4e0262a5e52-registry-certificates\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.605982 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.605945 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlf98\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-kube-api-access-zlf98\") pod \"image-registry-7776896ff5-2q4gp\" (UID: 
\"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.606134 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.606017 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/095a4938-5652-428e-9f3f-d766898b0bab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.606134 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.606063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37210912-0c16-4531-be5a-e4e0262a5e52-trusted-ca\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.642945 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.642919 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-h8kbj"] Apr 21 10:04:43.669953 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.669931 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h8kbj"] Apr 21 10:04:43.670055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.669973 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h8kbj" Apr 21 10:04:43.673283 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.672699 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 10:04:43.673283 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.672811 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 10:04:43.673283 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.672915 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 10:04:43.675311 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.673633 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sjbvv\"" Apr 21 10:04:43.706840 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.706819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/095a4938-5652-428e-9f3f-d766898b0bab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.706919 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.706851 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d6b88a-d380-4539-b153-560938088617-metrics-tls\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb" Apr 21 10:04:43.706919 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.706878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/37210912-0c16-4531-be5a-e4e0262a5e52-trusted-ca\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.706994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.706939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/095a4938-5652-428e-9f3f-d766898b0bab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.706994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.706972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-bound-sa-token\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.706994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.706991 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37210912-0c16-4531-be5a-e4e0262a5e52-ca-trust-extracted\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.707175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/095a4938-5652-428e-9f3f-d766898b0bab-data-volume\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.707175 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:04:43.707045 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6d6b88a-d380-4539-b153-560938088617-tmp-dir\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb" Apr 21 10:04:43.707175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707070 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkn9\" (UniqueName: \"kubernetes.io/projected/f6d6b88a-d380-4539-b153-560938088617-kube-api-access-tqkn9\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb" Apr 21 10:04:43.707175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707101 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37210912-0c16-4531-be5a-e4e0262a5e52-installation-pull-secrets\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.707175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707165 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6d6b88a-d380-4539-b153-560938088617-config-volume\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707193 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9fz\" (UniqueName: \"kubernetes.io/projected/e89bee18-57f7-4cb6-9183-9ad08b859350-kube-api-access-sk9fz\") pod \"ingress-canary-h8kbj\" (UID: 
\"e89bee18-57f7-4cb6-9183-9ad08b859350\") " pod="openshift-ingress-canary/ingress-canary-h8kbj" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707226 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c8dd\" (UniqueName: \"kubernetes.io/projected/095a4938-5652-428e-9f3f-d766898b0bab-kube-api-access-2c8dd\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707258 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-registry-tls\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/095a4938-5652-428e-9f3f-d766898b0bab-crio-socket\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37210912-0c16-4531-be5a-e4e0262a5e52-image-registry-private-configuration\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37210912-0c16-4531-be5a-e4e0262a5e52-registry-certificates\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707331 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlf98\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-kube-api-access-zlf98\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:04:43.707408 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707354 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89bee18-57f7-4cb6-9183-9ad08b859350-cert\") pod \"ingress-canary-h8kbj\" (UID: \"e89bee18-57f7-4cb6-9183-9ad08b859350\") " pod="openshift-ingress-canary/ingress-canary-h8kbj" Apr 21 10:04:43.707760 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/095a4938-5652-428e-9f3f-d766898b0bab-data-volume\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.707760 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/095a4938-5652-428e-9f3f-d766898b0bab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r" Apr 21 10:04:43.707760 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.707603 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/095a4938-5652-428e-9f3f-d766898b0bab-crio-socket\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r"
Apr 21 10:04:43.708186 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.708168 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37210912-0c16-4531-be5a-e4e0262a5e52-registry-certificates\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.708430 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.708409 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37210912-0c16-4531-be5a-e4e0262a5e52-trusted-ca\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.710943 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.710917 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/095a4938-5652-428e-9f3f-d766898b0bab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r"
Apr 21 10:04:43.711043 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.711009 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-registry-tls\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.711101 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.711013 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37210912-0c16-4531-be5a-e4e0262a5e52-installation-pull-secrets\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.711101 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.711046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37210912-0c16-4531-be5a-e4e0262a5e52-image-registry-private-configuration\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.717742 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.717719 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37210912-0c16-4531-be5a-e4e0262a5e52-ca-trust-extracted\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.719413 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.719391 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlf98\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-kube-api-access-zlf98\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.719556 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.719540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37210912-0c16-4531-be5a-e4e0262a5e52-bound-sa-token\") pod \"image-registry-7776896ff5-2q4gp\" (UID: \"37210912-0c16-4531-be5a-e4e0262a5e52\") " pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.719903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.719888 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c8dd\" (UniqueName: \"kubernetes.io/projected/095a4938-5652-428e-9f3f-d766898b0bab-kube-api-access-2c8dd\") pod \"insights-runtime-extractor-x974r\" (UID: \"095a4938-5652-428e-9f3f-d766898b0bab\") " pod="openshift-insights/insights-runtime-extractor-x974r"
Apr 21 10:04:43.807828 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.807807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6d6b88a-d380-4539-b153-560938088617-tmp-dir\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.807959 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.807835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkn9\" (UniqueName: \"kubernetes.io/projected/f6d6b88a-d380-4539-b153-560938088617-kube-api-access-tqkn9\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.807959 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.807874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6d6b88a-d380-4539-b153-560938088617-config-volume\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.807959 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.807890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9fz\" (UniqueName: \"kubernetes.io/projected/e89bee18-57f7-4cb6-9183-9ad08b859350-kube-api-access-sk9fz\") pod \"ingress-canary-h8kbj\" (UID: \"e89bee18-57f7-4cb6-9183-9ad08b859350\") " pod="openshift-ingress-canary/ingress-canary-h8kbj"
Apr 21 10:04:43.807959 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.807945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89bee18-57f7-4cb6-9183-9ad08b859350-cert\") pod \"ingress-canary-h8kbj\" (UID: \"e89bee18-57f7-4cb6-9183-9ad08b859350\") " pod="openshift-ingress-canary/ingress-canary-h8kbj"
Apr 21 10:04:43.808192 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.807968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d6b88a-d380-4539-b153-560938088617-metrics-tls\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.808192 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.808139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6d6b88a-d380-4539-b153-560938088617-tmp-dir\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.808455 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.808434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6d6b88a-d380-4539-b153-560938088617-config-volume\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.810612 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.810591 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d6b88a-d380-4539-b153-560938088617-metrics-tls\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.810757 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.810739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89bee18-57f7-4cb6-9183-9ad08b859350-cert\") pod \"ingress-canary-h8kbj\" (UID: \"e89bee18-57f7-4cb6-9183-9ad08b859350\") " pod="openshift-ingress-canary/ingress-canary-h8kbj"
Apr 21 10:04:43.827022 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.827001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9fz\" (UniqueName: \"kubernetes.io/projected/e89bee18-57f7-4cb6-9183-9ad08b859350-kube-api-access-sk9fz\") pod \"ingress-canary-h8kbj\" (UID: \"e89bee18-57f7-4cb6-9183-9ad08b859350\") " pod="openshift-ingress-canary/ingress-canary-h8kbj"
Apr 21 10:04:43.830337 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.830318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkn9\" (UniqueName: \"kubernetes.io/projected/f6d6b88a-d380-4539-b153-560938088617-kube-api-access-tqkn9\") pod \"dns-default-lzplb\" (UID: \"f6d6b88a-d380-4539-b153-560938088617\") " pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.860323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.860303 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:43.882994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.882975 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x974r"
Apr 21 10:04:43.907842 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.907813 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:43.982379 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:43.981632 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h8kbj"
Apr 21 10:04:44.035993 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.035945 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x974r"]
Apr 21 10:04:44.041278 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:44.041244 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod095a4938_5652_428e_9f3f_d766898b0bab.slice/crio-8755c3adb82a282104b7b3670d3eaf300199707b259dc5e20fa69770374adf15 WatchSource:0}: Error finding container 8755c3adb82a282104b7b3670d3eaf300199707b259dc5e20fa69770374adf15: Status 404 returned error can't find the container with id 8755c3adb82a282104b7b3670d3eaf300199707b259dc5e20fa69770374adf15
Apr 21 10:04:44.108663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.108612 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h8kbj"]
Apr 21 10:04:44.111937 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:44.111914 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode89bee18_57f7_4cb6_9183_9ad08b859350.slice/crio-d97590c587e004b4d82c0c095de9709fade32fe82fec75210149e27bf61126d2 WatchSource:0}: Error finding container d97590c587e004b4d82c0c095de9709fade32fe82fec75210149e27bf61126d2: Status 404 returned error can't find the container with id d97590c587e004b4d82c0c095de9709fade32fe82fec75210149e27bf61126d2
Apr 21 10:04:44.249863 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.249833 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lzplb"]
Apr 21 10:04:44.252971 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.252949 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k"
Apr 21 10:04:44.253164 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.253144 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72"
Apr 21 10:04:44.253763 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:44.253742 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d6b88a_d380_4539_b153_560938088617.slice/crio-10855f537b2722cc6819c1a1242a861dec4f6ae010ffa871240362c3d5ae24ab WatchSource:0}: Error finding container 10855f537b2722cc6819c1a1242a861dec4f6ae010ffa871240362c3d5ae24ab: Status 404 returned error can't find the container with id 10855f537b2722cc6819c1a1242a861dec4f6ae010ffa871240362c3d5ae24ab
Apr 21 10:04:44.255384 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.255358 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 10:04:44.255482 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.255441 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 10:04:44.255482 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.255441 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qrq7w\""
Apr 21 10:04:44.256230 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.256210 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 10:04:44.256455 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.256436 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fjqp7\""
Apr 21 10:04:44.256725 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.256692 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7776896ff5-2q4gp"]
Apr 21 10:04:44.257013 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:44.256996 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37210912_0c16_4531_be5a_e4e0262a5e52.slice/crio-23398dbd14911a0cb2ff26aac1e87b19c9ee14810b6da50962c8b443d0220978 WatchSource:0}: Error finding container 23398dbd14911a0cb2ff26aac1e87b19c9ee14810b6da50962c8b443d0220978: Status 404 returned error can't find the container with id 23398dbd14911a0cb2ff26aac1e87b19c9ee14810b6da50962c8b443d0220978
Apr 21 10:04:44.496087 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.496044 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h8kbj" event={"ID":"e89bee18-57f7-4cb6-9183-9ad08b859350","Type":"ContainerStarted","Data":"d97590c587e004b4d82c0c095de9709fade32fe82fec75210149e27bf61126d2"}
Apr 21 10:04:44.497338 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.497306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" event={"ID":"37210912-0c16-4531-be5a-e4e0262a5e52","Type":"ContainerStarted","Data":"08192566e9890c5ae24a8560351e186bc632d124c670a0bba1c6dc23d170fe8a"}
Apr 21 10:04:44.497338 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.497340 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" event={"ID":"37210912-0c16-4531-be5a-e4e0262a5e52","Type":"ContainerStarted","Data":"23398dbd14911a0cb2ff26aac1e87b19c9ee14810b6da50962c8b443d0220978"}
Apr 21 10:04:44.497520 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.497488 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp"
Apr 21 10:04:44.498402 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.498380 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzplb" event={"ID":"f6d6b88a-d380-4539-b153-560938088617","Type":"ContainerStarted","Data":"10855f537b2722cc6819c1a1242a861dec4f6ae010ffa871240362c3d5ae24ab"}
Apr 21 10:04:44.499610 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.499590 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x974r" event={"ID":"095a4938-5652-428e-9f3f-d766898b0bab","Type":"ContainerStarted","Data":"424b69886b497992dc232a3ce342519f6b99f50a4455a054a0a4a21266728bdb"}
Apr 21 10:04:44.499610 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.499616 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x974r" event={"ID":"095a4938-5652-428e-9f3f-d766898b0bab","Type":"ContainerStarted","Data":"8755c3adb82a282104b7b3670d3eaf300199707b259dc5e20fa69770374adf15"}
Apr 21 10:04:44.522891 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:44.522820 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" podStartSLOduration=1.522805359 podStartE2EDuration="1.522805359s" podCreationTimestamp="2026-04-21 10:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:04:44.522396659 +0000 UTC m=+52.866846584" watchObservedRunningTime="2026-04-21 10:04:44.522805359 +0000 UTC m=+52.867255242"
Apr 21 10:04:45.252364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:45.252291 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw"
Apr 21 10:04:45.254774 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:45.254752 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 10:04:45.503528 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:45.503424 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x974r" event={"ID":"095a4938-5652-428e-9f3f-d766898b0bab","Type":"ContainerStarted","Data":"a59fbf4b0c99af1638568545bb659c913d75e5f594cc97706edc8c54d037713c"}
Apr 21 10:04:47.510287 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:47.510252 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzplb" event={"ID":"f6d6b88a-d380-4539-b153-560938088617","Type":"ContainerStarted","Data":"78fc1e1cc37cc099eeb5ffc26624fd481f23eb57bec90a2a5cbdf204aa4d67e4"}
Apr 21 10:04:47.511507 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:47.511470 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h8kbj" event={"ID":"e89bee18-57f7-4cb6-9183-9ad08b859350","Type":"ContainerStarted","Data":"afe3dc0116bbc1814e1372b90f7ef16ee6fc0ffe222122b06f08fb4c0aec1913"}
Apr 21 10:04:47.530942 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:47.530520 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-h8kbj" podStartSLOduration=1.347305627 podStartE2EDuration="4.53050329s" podCreationTimestamp="2026-04-21 10:04:43 +0000 UTC" firstStartedPulling="2026-04-21 10:04:44.113667017 +0000 UTC m=+52.458116880" lastFinishedPulling="2026-04-21 10:04:47.296864674 +0000 UTC m=+55.641314543" observedRunningTime="2026-04-21 10:04:47.529866998 +0000 UTC m=+55.874316883" watchObservedRunningTime="2026-04-21 10:04:47.53050329 +0000 UTC m=+55.874953175"
Apr 21 10:04:48.515146 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:48.515095 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lzplb" event={"ID":"f6d6b88a-d380-4539-b153-560938088617","Type":"ContainerStarted","Data":"948f41e262f79dbf555d962f94fdda6bd22a51ce31e4d086cef53077fc34d38e"}
Apr 21 10:04:48.515597 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:48.515320 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lzplb"
Apr 21 10:04:48.519161 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:48.519139 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x974r" event={"ID":"095a4938-5652-428e-9f3f-d766898b0bab","Type":"ContainerStarted","Data":"cbf887b1a5828ac89374bd09efbb24a26d8b7c20b9d06d0e41b3278610a77502"}
Apr 21 10:04:48.531419 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:48.531382 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lzplb" podStartSLOduration=2.486754834 podStartE2EDuration="5.531369969s" podCreationTimestamp="2026-04-21 10:04:43 +0000 UTC" firstStartedPulling="2026-04-21 10:04:44.256212369 +0000 UTC m=+52.600662244" lastFinishedPulling="2026-04-21 10:04:47.300827516 +0000 UTC m=+55.645277379" observedRunningTime="2026-04-21 10:04:48.530238079 +0000 UTC m=+56.874687965" watchObservedRunningTime="2026-04-21 10:04:48.531369969 +0000 UTC m=+56.875819854"
Apr 21 10:04:48.547099 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:48.547066 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x974r" podStartSLOduration=2.024099808 podStartE2EDuration="5.547055129s" podCreationTimestamp="2026-04-21 10:04:43 +0000 UTC" firstStartedPulling="2026-04-21 10:04:44.150729935 +0000 UTC m=+52.495179798" lastFinishedPulling="2026-04-21 10:04:47.673685253 +0000 UTC m=+56.018135119" observedRunningTime="2026-04-21 10:04:48.546325283 +0000 UTC m=+56.890775168" watchObservedRunningTime="2026-04-21 10:04:48.547055129 +0000 UTC m=+56.891505005"
Apr 21 10:04:50.457876 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:50.457849 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l2qr9"
Apr 21 10:04:51.263542 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.263509 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"]
Apr 21 10:04:51.299913 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.299888 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2hxkg"]
Apr 21 10:04:51.300084 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.300066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.303956 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.303932 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 10:04:51.303956 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.303938 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 10:04:51.304273 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.304258 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 10:04:51.304329 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.304258 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 10:04:51.305039 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.304966 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-hwpt9\""
Apr 21 10:04:51.305039 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.305017 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 10:04:51.326466 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.326442 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"]
Apr 21 10:04:51.326557 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.326545 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.328777 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.328758 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 10:04:51.328894 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.328806 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 10:04:51.328894 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.328823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 10:04:51.329081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.329065 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-48j4v\""
Apr 21 10:04:51.363222 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363344 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrh4q\" (UniqueName: \"kubernetes.io/projected/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-kube-api-access-qrh4q\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363344 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363264 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-accelerators-collector-config\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363344 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-root\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363481 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-textfile\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363481 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363413 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqwg\" (UniqueName: \"kubernetes.io/projected/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-kube-api-access-5kqwg\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.363481 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363443 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-wtmp\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363481 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.363596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.363596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363515 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-sys\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-metrics-client-ca\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.363596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.363712 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.363610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-tls\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.463940 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.463917 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrh4q\" (UniqueName: \"kubernetes.io/projected/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-kube-api-access-qrh4q\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.463946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-accelerators-collector-config\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.463977 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-root\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464023 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-textfile\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464056 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqwg\" (UniqueName: \"kubernetes.io/projected/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-kube-api-access-5kqwg\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-root\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464180 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-wtmp\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464233 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.464271 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464309 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-sys\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-wtmp\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464341 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-metrics-client-ca\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464425 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-sys\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-tls\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.464618 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.464472 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.466544 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.466526 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.466748 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.466730 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.471977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.471949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqwg\" (UniqueName: \"kubernetes.io/projected/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-kube-api-access-5kqwg\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.474658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.474613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-textfile\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.474826 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.474804 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-accelerators-collector-config\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.474903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.474813 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-metrics-client-ca\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.474962 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.474910 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a6954c-23c1-4d88-b436-1b9885f0dfc3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-92xmm\" (UID: \"d1a6954c-23c1-4d88-b436-1b9885f0dfc3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"
Apr 21 10:04:51.476593 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.476572 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-tls\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg"
Apr 21 10:04:51.477008 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.476994 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrh4q\" (UniqueName: 
\"kubernetes.io/projected/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-kube-api-access-qrh4q\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg" Apr 21 10:04:51.477059 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.477015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/68aa0fc3-11c0-423b-84c6-5f7b2c07e131-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2hxkg\" (UID: \"68aa0fc3-11c0-423b-84c6-5f7b2c07e131\") " pod="openshift-monitoring/node-exporter-2hxkg" Apr 21 10:04:51.608755 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.608727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm" Apr 21 10:04:51.634410 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.634382 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2hxkg" Apr 21 10:04:51.641465 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:51.641435 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68aa0fc3_11c0_423b_84c6_5f7b2c07e131.slice/crio-2a8b0be55348100362690ce8c4c7f63311d29289e5e2d26b28f91529d5a6ea41 WatchSource:0}: Error finding container 2a8b0be55348100362690ce8c4c7f63311d29289e5e2d26b28f91529d5a6ea41: Status 404 returned error can't find the container with id 2a8b0be55348100362690ce8c4c7f63311d29289e5e2d26b28f91529d5a6ea41 Apr 21 10:04:51.724329 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:51.724304 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm"] Apr 21 10:04:51.727242 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:51.727216 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a6954c_23c1_4d88_b436_1b9885f0dfc3.slice/crio-d918c115811a6e4107654b974619f4dfb1897f700e7bb5e2d5c856202299f627 WatchSource:0}: Error finding container d918c115811a6e4107654b974619f4dfb1897f700e7bb5e2d5c856202299f627: Status 404 returned error can't find the container with id d918c115811a6e4107654b974619f4dfb1897f700e7bb5e2d5c856202299f627 Apr 21 10:04:52.345733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.345698 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:04:52.350215 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.350193 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.352773 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.352746 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 10:04:52.352773 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.352762 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 10:04:52.352939 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.352781 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.352996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.353075 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.353076 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.353131 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zslxs\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.353154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.353223 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 10:04:52.353370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.353282 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 10:04:52.361472 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.361451 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:04:52.472016 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.471988 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-config-volume\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472153 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472189 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472217 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-config-out\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472327 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472409 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:04:52.472379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472436 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472463 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472494 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pwc\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-kube-api-access-v7pwc\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.472731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.472531 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-web-config\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.532242 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.532199 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm" event={"ID":"d1a6954c-23c1-4d88-b436-1b9885f0dfc3","Type":"ContainerStarted","Data":"722b421c1dee10383a0f47b91fe1f5763d381d2624b1fbab3af169a312719fb9"} Apr 21 10:04:52.532242 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.532246 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm" event={"ID":"d1a6954c-23c1-4d88-b436-1b9885f0dfc3","Type":"ContainerStarted","Data":"a4a4a0eb32c5ccda8ffb5f0bba4a7c128f691c87b59c7ba65b4c0e6c33a5f06b"} Apr 21 10:04:52.532470 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.532256 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm" event={"ID":"d1a6954c-23c1-4d88-b436-1b9885f0dfc3","Type":"ContainerStarted","Data":"d918c115811a6e4107654b974619f4dfb1897f700e7bb5e2d5c856202299f627"} Apr 21 10:04:52.534125 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.534074 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hxkg" event={"ID":"68aa0fc3-11c0-423b-84c6-5f7b2c07e131","Type":"ContainerStarted","Data":"7f0f8b65697187b00fc0ee80018a60f6efc06353a40eac7e1256615ac8762c93"} Apr 21 10:04:52.534244 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.534138 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hxkg" event={"ID":"68aa0fc3-11c0-423b-84c6-5f7b2c07e131","Type":"ContainerStarted","Data":"2a8b0be55348100362690ce8c4c7f63311d29289e5e2d26b28f91529d5a6ea41"} Apr 21 10:04:52.573134 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:04:52.573084 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-config-volume\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.573309 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573162 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.573309 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.573309 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:04:52.573495 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573462 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 21 10:04:52.573569 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573569 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573550 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-config-out\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573709 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pwc\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-kube-api-access-v7pwc\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.573992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.573888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-web-config\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.574777 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.574340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.574777 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.574678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.575695 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.575448 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.576205 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.576177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-config-out\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.577598 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.577535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.577817 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.577799 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-web-config\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.578242 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.578219 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.578530 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.578479 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.578633 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.578548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.578633 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.578567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-config-volume\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.579449 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.579417 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.579522 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.579495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.584560 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.584537 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pwc\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-kube-api-access-v7pwc\") pod \"alertmanager-main-0\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.662561 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.662481 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:04:52.796087 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:52.796054 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 10:04:52.912788 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:52.912710 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d96bb4c_72ad_4b48_b738_13a156de3777.slice/crio-631bca38ef677bef17362b88f069b2ad98321a72152e8e5163d621cd59e015a2 WatchSource:0}: Error finding container 631bca38ef677bef17362b88f069b2ad98321a72152e8e5163d621cd59e015a2: Status 404 returned error can't find the container with id 631bca38ef677bef17362b88f069b2ad98321a72152e8e5163d621cd59e015a2
Apr 21 10:04:53.537865 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:53.537835 2567 generic.go:358] "Generic (PLEG): container finished" podID="68aa0fc3-11c0-423b-84c6-5f7b2c07e131" containerID="7f0f8b65697187b00fc0ee80018a60f6efc06353a40eac7e1256615ac8762c93" exitCode=0
Apr 21 10:04:53.538304 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:53.537897 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hxkg" event={"ID":"68aa0fc3-11c0-423b-84c6-5f7b2c07e131","Type":"ContainerDied","Data":"7f0f8b65697187b00fc0ee80018a60f6efc06353a40eac7e1256615ac8762c93"}
Apr 21 10:04:53.538972 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:53.538950 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"631bca38ef677bef17362b88f069b2ad98321a72152e8e5163d621cd59e015a2"}
Apr 21 10:04:53.540601 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:53.540580 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm" event={"ID":"d1a6954c-23c1-4d88-b436-1b9885f0dfc3","Type":"ContainerStarted","Data":"bb63c3c726395fd576940e302157b96758393a09cf5c3cf8c91403542f8245c7"}
Apr 21 10:04:53.574327 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:53.574283 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-92xmm" podStartSLOduration=1.5607903109999999 podStartE2EDuration="2.574270191s" podCreationTimestamp="2026-04-21 10:04:51 +0000 UTC" firstStartedPulling="2026-04-21 10:04:51.942348356 +0000 UTC m=+60.286798233" lastFinishedPulling="2026-04-21 10:04:52.955828246 +0000 UTC m=+61.300278113" observedRunningTime="2026-04-21 10:04:53.573624163 +0000 UTC m=+61.918074050" watchObservedRunningTime="2026-04-21 10:04:53.574270191 +0000 UTC m=+61.918720076"
Apr 21 10:04:54.549275 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:54.549235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hxkg" event={"ID":"68aa0fc3-11c0-423b-84c6-5f7b2c07e131","Type":"ContainerStarted","Data":"171bb579a055396c72e79fcddc659c08831aafce4e373872cd478ebb5ed04f81"}
Apr 21 10:04:54.549275 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:54.549277 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2hxkg" event={"ID":"68aa0fc3-11c0-423b-84c6-5f7b2c07e131","Type":"ContainerStarted","Data":"b8ca1bbcbc8fc02895677dab6c4a5693121a26b424e2961405c6d0ebb7ba6e15"}
Apr 21 10:04:54.550562 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:54.550540 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="49e91b6eded9a17bd540233a7fc7c43a52dc4d41d20ff21699de17e9c5dbe213" exitCode=0
Apr 21 10:04:54.550651 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:54.550627 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"49e91b6eded9a17bd540233a7fc7c43a52dc4d41d20ff21699de17e9c5dbe213"}
Apr 21 10:04:54.569429 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:54.569332 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2hxkg" podStartSLOduration=2.803403194 podStartE2EDuration="3.569318021s" podCreationTimestamp="2026-04-21 10:04:51 +0000 UTC" firstStartedPulling="2026-04-21 10:04:51.643146722 +0000 UTC m=+59.987596599" lastFinishedPulling="2026-04-21 10:04:52.409061545 +0000 UTC m=+60.753511426" observedRunningTime="2026-04-21 10:04:54.568320299 +0000 UTC m=+62.912770183" watchObservedRunningTime="2026-04-21 10:04:54.569318021 +0000 UTC m=+62.913767910"
Apr 21 10:04:55.581735 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.581705 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5747f67664-2t59r"]
Apr 21 10:04:55.584288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.584259 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5747f67664-2t59r"
Apr 21 10:04:55.587731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.587628 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9qoemoi2im3rd\""
Apr 21 10:04:55.587731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.587652 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 21 10:04:55.587731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.587662 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 21 10:04:55.587731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.587707 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 21 10:04:55.587731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.587662 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 21 10:04:55.588078 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.588061 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-cltd7\""
Apr 21 10:04:55.593328 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.593310 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5747f67664-2t59r"]
Apr 21 10:04:55.700825 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.700797 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-client-ca-bundle\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r"
Apr 21 10:04:55.700987 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.700845 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-secret-metrics-server-client-certs\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r"
Apr 21 10:04:55.700987 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.700913 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/60918ad2-789d-42c3-ae43-a03117a42abc-audit-log\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r"
Apr 21 10:04:55.701078 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.701015 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5vl\" (UniqueName: \"kubernetes.io/projected/60918ad2-789d-42c3-ae43-a03117a42abc-kube-api-access-nn5vl\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r"
Apr 21 10:04:55.701078 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.701058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-secret-metrics-server-tls\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r"
Apr 21 10:04:55.701200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.701161 2567
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60918ad2-789d-42c3-ae43-a03117a42abc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.701249 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.701195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/60918ad2-789d-42c3-ae43-a03117a42abc-metrics-server-audit-profiles\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802347 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802305 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/60918ad2-789d-42c3-ae43-a03117a42abc-audit-log\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5vl\" (UniqueName: \"kubernetes.io/projected/60918ad2-789d-42c3-ae43-a03117a42abc-kube-api-access-nn5vl\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802402 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-secret-metrics-server-tls\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60918ad2-789d-42c3-ae43-a03117a42abc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802484 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/60918ad2-789d-42c3-ae43-a03117a42abc-metrics-server-audit-profiles\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802531 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-client-ca-bundle\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802575 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-secret-metrics-server-client-certs\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " 
pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.802733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.802717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/60918ad2-789d-42c3-ae43-a03117a42abc-audit-log\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.803503 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.803477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60918ad2-789d-42c3-ae43-a03117a42abc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.804139 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.804098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/60918ad2-789d-42c3-ae43-a03117a42abc-metrics-server-audit-profiles\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.805182 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.805139 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-secret-metrics-server-client-certs\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.805328 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.805308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-client-ca-bundle\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.805382 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.805317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/60918ad2-789d-42c3-ae43-a03117a42abc-secret-metrics-server-tls\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.810332 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.810311 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5vl\" (UniqueName: \"kubernetes.io/projected/60918ad2-789d-42c3-ae43-a03117a42abc-kube-api-access-nn5vl\") pod \"metrics-server-5747f67664-2t59r\" (UID: \"60918ad2-789d-42c3-ae43-a03117a42abc\") " pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:55.895241 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:55.895160 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:04:56.048764 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.048740 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5747f67664-2t59r"] Apr 21 10:04:56.051782 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:56.051753 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60918ad2_789d_42c3_ae43_a03117a42abc.slice/crio-2ad0bc22be0176352ae16517f0991c709e4ffd5a7de70ba8c8c2aeb09bcfe0a0 WatchSource:0}: Error finding container 2ad0bc22be0176352ae16517f0991c709e4ffd5a7de70ba8c8c2aeb09bcfe0a0: Status 404 returned error can't find the container with id 2ad0bc22be0176352ae16517f0991c709e4ffd5a7de70ba8c8c2aeb09bcfe0a0 Apr 21 10:04:56.057768 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.057748 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv"] Apr 21 10:04:56.060803 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.060781 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:56.063284 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.063260 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 10:04:56.063400 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.063315 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gwlfm\"" Apr 21 10:04:56.067955 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.067919 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv"] Apr 21 10:04:56.105467 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.105445 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2b092e0-5084-4ac8-a541-ebb66bf667a2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7fvv\" (UID: \"f2b092e0-5084-4ac8-a541-ebb66bf667a2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:56.206585 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.206563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2b092e0-5084-4ac8-a541-ebb66bf667a2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7fvv\" (UID: \"f2b092e0-5084-4ac8-a541-ebb66bf667a2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:56.206721 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:56.206704 2567 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 10:04:56.206789 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:04:56.206774 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f2b092e0-5084-4ac8-a541-ebb66bf667a2-monitoring-plugin-cert podName:f2b092e0-5084-4ac8-a541-ebb66bf667a2 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.706751483 +0000 UTC m=+65.051201351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/f2b092e0-5084-4ac8-a541-ebb66bf667a2-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-j7fvv" (UID: "f2b092e0-5084-4ac8-a541-ebb66bf667a2") : secret "monitoring-plugin-cert" not found Apr 21 10:04:56.458133 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.458038 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-75fdf7498f-rzddk"] Apr 21 10:04:56.461278 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.461264 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.464490 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.464464 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 10:04:56.464598 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.464495 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 10:04:56.464598 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.464555 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 10:04:56.464750 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.464736 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 10:04:56.466385 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.465359 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-rqd46\"" Apr 21 10:04:56.466874 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.466685 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 10:04:56.472994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.472961 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 10:04:56.474464 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.474445 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75fdf7498f-rzddk"] Apr 21 10:04:56.509074 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509043 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.509174 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509088 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-secret-telemeter-client\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.509234 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-metrics-client-ca\") pod 
\"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.509327 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-federate-client-tls\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.509379 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509336 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-serving-certs-ca-bundle\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.509442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509383 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-telemeter-client-tls\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.509534 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm7nm\" (UniqueName: \"kubernetes.io/projected/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-kube-api-access-fm7nm\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 
10:04:56.509534 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.509482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.560880 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.560849 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"8c8602f9a3b7a6eed6aca56d7c48aef67c87f27261a9c847556b768116d8f1e2"} Apr 21 10:04:56.560981 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.560887 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"760355aa7c2d9c19e3efa331d362b1f9f314d8cc39d1f8a08851415bddb39c47"} Apr 21 10:04:56.560981 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.560901 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"ca1b578c57ff9886f0597b5062de5f2c45b0484aaca522dff0d3d0f2809507a2"} Apr 21 10:04:56.560981 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.560914 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"27e85b104060e0271db6297fc4f35cefa5bcd435e1046e9018d5067ff5c13c2a"} Apr 21 10:04:56.560981 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.560927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"37246dcfc446cbd2bc65c0996f11a2a626ef4643bf4c64318dcd88d279c6d858"} Apr 21 10:04:56.562010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.561983 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" event={"ID":"60918ad2-789d-42c3-ae43-a03117a42abc","Type":"ContainerStarted","Data":"2ad0bc22be0176352ae16517f0991c709e4ffd5a7de70ba8c8c2aeb09bcfe0a0"} Apr 21 10:04:56.609973 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.609949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-serving-certs-ca-bundle\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.609978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-telemeter-client-tls\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.609996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fm7nm\" (UniqueName: \"kubernetes.io/projected/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-kube-api-access-fm7nm\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610025 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610052 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-secret-telemeter-client\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610272 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-metrics-client-ca\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610447 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-federate-client-tls\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: 
\"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.610793 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-serving-certs-ca-bundle\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.611029 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.610958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-metrics-client-ca\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.611149 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.611046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.613273 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.613252 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-federate-client-tls\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.613512 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.613489 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-secret-telemeter-client\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.613512 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.613498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.613667 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.613562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-telemeter-client-tls\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.619364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.619342 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm7nm\" (UniqueName: \"kubernetes.io/projected/cffa9bea-51ed-4ef6-a644-9598dc5dcc73-kube-api-access-fm7nm\") pod \"telemeter-client-75fdf7498f-rzddk\" (UID: \"cffa9bea-51ed-4ef6-a644-9598dc5dcc73\") " pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.711213 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.711076 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2b092e0-5084-4ac8-a541-ebb66bf667a2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7fvv\" (UID: 
\"f2b092e0-5084-4ac8-a541-ebb66bf667a2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:56.713702 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.713676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f2b092e0-5084-4ac8-a541-ebb66bf667a2-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-j7fvv\" (UID: \"f2b092e0-5084-4ac8-a541-ebb66bf667a2\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:56.772531 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.772500 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" Apr 21 10:04:56.971034 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:56.970897 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:57.023593 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.023564 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75fdf7498f-rzddk"] Apr 21 10:04:57.125251 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.125220 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv"] Apr 21 10:04:57.415685 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:57.415645 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2b092e0_5084_4ac8_a541_ebb66bf667a2.slice/crio-eecc9e28a0721588b645a3ac7f3b46095cb9ba4dd60a992dd214c9f8182663f8 WatchSource:0}: Error finding container eecc9e28a0721588b645a3ac7f3b46095cb9ba4dd60a992dd214c9f8182663f8: Status 404 returned error can't find the container with id eecc9e28a0721588b645a3ac7f3b46095cb9ba4dd60a992dd214c9f8182663f8 Apr 21 10:04:57.567328 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:04:57.567287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerStarted","Data":"99cb2a804ec16d6bf846841a6b8ce5e1f9772da79cee9b11448e905e32a54c9d"} Apr 21 10:04:57.568683 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.568649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" event={"ID":"60918ad2-789d-42c3-ae43-a03117a42abc","Type":"ContainerStarted","Data":"763a95aecce5c9812ce99d54e45259d97ab12eb2eb3c8b2db5748bffc5e5b25f"} Apr 21 10:04:57.569710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.569681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" event={"ID":"f2b092e0-5084-4ac8-a541-ebb66bf667a2","Type":"ContainerStarted","Data":"eecc9e28a0721588b645a3ac7f3b46095cb9ba4dd60a992dd214c9f8182663f8"} Apr 21 10:04:57.570674 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.570654 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" event={"ID":"cffa9bea-51ed-4ef6-a644-9598dc5dcc73","Type":"ContainerStarted","Data":"1956a7df3d43c1b5de5466c9f39b926c0cf2d149fb3459b8085473cf29857986"} Apr 21 10:04:57.606989 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.606925 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:04:57.611029 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.611011 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.614932 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.614912 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 10:04:57.615025 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.614984 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 10:04:57.616012 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.615996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 10:04:57.616200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.616029 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ma6a69sqc1ei\"" Apr 21 10:04:57.616328 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.616313 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 10:04:57.617096 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.617079 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 10:04:57.617364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.617345 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 10:04:57.617503 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.617485 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 10:04:57.617584 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.617520 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 10:04:57.617724 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.617699 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 10:04:57.617819 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.617737 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wnq4m\"" Apr 21 10:04:57.624705 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.624686 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 10:04:57.629220 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.629180 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.6277553820000001 podStartE2EDuration="5.629165977s" podCreationTimestamp="2026-04-21 10:04:52 +0000 UTC" firstStartedPulling="2026-04-21 10:04:52.914545438 +0000 UTC m=+61.258995301" lastFinishedPulling="2026-04-21 10:04:56.915956013 +0000 UTC m=+65.260405896" observedRunningTime="2026-04-21 10:04:57.629045466 +0000 UTC m=+65.973495351" watchObservedRunningTime="2026-04-21 10:04:57.629165977 +0000 UTC m=+65.973615865" Apr 21 10:04:57.631382 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.631360 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 10:04:57.635672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.635652 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 10:04:57.641918 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.641858 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 
10:04:57.684027 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.683934 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" podStartSLOduration=1.277272188 podStartE2EDuration="2.683917679s" podCreationTimestamp="2026-04-21 10:04:55 +0000 UTC" firstStartedPulling="2026-04-21 10:04:56.053698805 +0000 UTC m=+64.398148682" lastFinishedPulling="2026-04-21 10:04:57.460344307 +0000 UTC m=+65.804794173" observedRunningTime="2026-04-21 10:04:57.683310486 +0000 UTC m=+66.027760372" watchObservedRunningTime="2026-04-21 10:04:57.683917679 +0000 UTC m=+66.028367568" Apr 21 10:04:57.719489 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719622 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719581 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719622 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719614 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719705 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:04:57.719643 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-web-config\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719705 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8k89\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-kube-api-access-j8k89\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719781 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719745 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719831 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719782 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719831 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719812 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719928 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719900 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719985 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719937 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.719985 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719970 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config-out\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720084 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.719998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720084 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.720032 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720197 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.720103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720197 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.720170 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720268 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.720198 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720268 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.720231 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.720268 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.720254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821260 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821357 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821381 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-web-config\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8k89\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-kube-api-access-j8k89\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821481 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821515 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.821606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821605 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config-out\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821632 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821708 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821794 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.821846 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.822018 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.822260 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.822169 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.823589 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.823562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.825934 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.825909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.828460 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.828017 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 21 10:04:57.828460 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.828285 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-web-config\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.829158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.828821 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config-out\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.829158 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.828975 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.829873 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.829548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.829873 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.829827 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.831089 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.830085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.831509 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.831477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.831608 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.831518 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.832267 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.832209 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.832373 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.832358 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.832545 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.832439 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8k89\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-kube-api-access-j8k89\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.832695 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.832565 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.834382 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.834345 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config\") pod \"prometheus-k8s-0\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.923005 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.922964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:57.923179 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.923150 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:04:57.925442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.925417 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:57.936821 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:57.936764 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b8b33c9-4316-4863-843f-730d4490910b-metrics-certs\") pod \"network-metrics-daemon-vrs72\" (UID: \"1b8b33c9-4316-4863-843f-730d4490910b\") " pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:58.024204 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.024104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:58.026892 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.026859 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:58.037896 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.037298 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:58.048383 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.048347 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8f4\" (UniqueName: \"kubernetes.io/projected/8cb513a2-6bf9-465c-bac3-8b87096c0e4e-kube-api-access-6j8f4\") pod \"network-check-target-lf67k\" (UID: \"8cb513a2-6bf9-465c-bac3-8b87096c0e4e\") " pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 
10:04:58.065516 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.065304 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qrq7w\"" Apr 21 10:04:58.070345 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.070325 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fjqp7\"" Apr 21 10:04:58.073164 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.073140 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:04:58.078269 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.078249 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vrs72" Apr 21 10:04:58.094262 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.094218 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:04:58.097940 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:58.097848 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62fcbae9_5553_48a3_93d6_a940e5ec6fff.slice/crio-091d50bd447fde9bab39d1888ae54db989012fb138ee1c1af6f78d12ccf5dd13 WatchSource:0}: Error finding container 091d50bd447fde9bab39d1888ae54db989012fb138ee1c1af6f78d12ccf5dd13: Status 404 returned error can't find the container with id 091d50bd447fde9bab39d1888ae54db989012fb138ee1c1af6f78d12ccf5dd13 Apr 21 10:04:58.232695 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.232639 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lf67k"] Apr 21 10:04:58.236784 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:58.236725 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cb513a2_6bf9_465c_bac3_8b87096c0e4e.slice/crio-e6c6fdf6da4748177b633e0c2e32271db75b9f624a89b4d3841a6776f5fa3f3f WatchSource:0}: Error finding container e6c6fdf6da4748177b633e0c2e32271db75b9f624a89b4d3841a6776f5fa3f3f: Status 404 returned error can't find the container with id e6c6fdf6da4748177b633e0c2e32271db75b9f624a89b4d3841a6776f5fa3f3f Apr 21 10:04:58.263407 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.263376 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vrs72"] Apr 21 10:04:58.294207 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:58.294166 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b8b33c9_4316_4863_843f_730d4490910b.slice/crio-1057b0e4f4b93ae1687f4fcbf577890986bd4202c179e32bf6ead643f197d620 WatchSource:0}: Error finding container 1057b0e4f4b93ae1687f4fcbf577890986bd4202c179e32bf6ead643f197d620: Status 404 returned error can't find the container with id 1057b0e4f4b93ae1687f4fcbf577890986bd4202c179e32bf6ead643f197d620 Apr 21 10:04:58.526088 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.526010 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lzplb" Apr 21 10:04:58.530503 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.530431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:58.532629 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.532605 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 10:04:58.543184 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.543157 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/66ec4bf1-14e6-42c4-9174-6a6f20406a1c-original-pull-secret\") pod \"global-pull-secret-syncer-kknsw\" (UID: \"66ec4bf1-14e6-42c4-9174-6a6f20406a1c\") " pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:58.575557 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.575527 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerID="d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5" exitCode=0 Apr 21 10:04:58.575698 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.575616 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5"} Apr 21 10:04:58.575698 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.575655 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"091d50bd447fde9bab39d1888ae54db989012fb138ee1c1af6f78d12ccf5dd13"} Apr 21 10:04:58.577074 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.577033 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vrs72" event={"ID":"1b8b33c9-4316-4863-843f-730d4490910b","Type":"ContainerStarted","Data":"1057b0e4f4b93ae1687f4fcbf577890986bd4202c179e32bf6ead643f197d620"} Apr 21 10:04:58.578443 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.578415 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lf67k" 
event={"ID":"8cb513a2-6bf9-465c-bac3-8b87096c0e4e","Type":"ContainerStarted","Data":"e6c6fdf6da4748177b633e0c2e32271db75b9f624a89b4d3841a6776f5fa3f3f"} Apr 21 10:04:58.763392 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:58.763352 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kknsw" Apr 21 10:04:59.267142 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.267078 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kknsw"] Apr 21 10:04:59.273609 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:04:59.273578 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ec4bf1_14e6_42c4_9174_6a6f20406a1c.slice/crio-8da29b24ad1fb34f4ace01f2b634a4612a1214411e4dd2129dbe1f318391315e WatchSource:0}: Error finding container 8da29b24ad1fb34f4ace01f2b634a4612a1214411e4dd2129dbe1f318391315e: Status 404 returned error can't find the container with id 8da29b24ad1fb34f4ace01f2b634a4612a1214411e4dd2129dbe1f318391315e Apr 21 10:04:59.584551 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.583709 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" event={"ID":"f2b092e0-5084-4ac8-a541-ebb66bf667a2","Type":"ContainerStarted","Data":"6b5703ab4a570149185a5cb49886f0e29c77cbdc3e4417d941ac62ba307c7fe9"} Apr 21 10:04:59.584551 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.584444 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:59.587977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.587906 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" 
event={"ID":"cffa9bea-51ed-4ef6-a644-9598dc5dcc73","Type":"ContainerStarted","Data":"764c5b14d30b79e117cdcd29725f1ca8f4461080fcb1be2cb867b316d3b2f3f6"} Apr 21 10:04:59.587977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.587940 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" event={"ID":"cffa9bea-51ed-4ef6-a644-9598dc5dcc73","Type":"ContainerStarted","Data":"f122ffb19ea43de651bf46adb79f921d9e364af7df638c9637dfe320a74f1b16"} Apr 21 10:04:59.587977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.587957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" event={"ID":"cffa9bea-51ed-4ef6-a644-9598dc5dcc73","Type":"ContainerStarted","Data":"e6bcb20e62c26ba5a0778e122fa266181e040eedf79f90d5e4d1735b88a4b521"} Apr 21 10:04:59.590077 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.590033 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kknsw" event={"ID":"66ec4bf1-14e6-42c4-9174-6a6f20406a1c","Type":"ContainerStarted","Data":"8da29b24ad1fb34f4ace01f2b634a4612a1214411e4dd2129dbe1f318391315e"} Apr 21 10:04:59.590851 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.590831 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" Apr 21 10:04:59.620587 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.620539 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-j7fvv" podStartSLOduration=1.913475811 podStartE2EDuration="3.620521932s" podCreationTimestamp="2026-04-21 10:04:56 +0000 UTC" firstStartedPulling="2026-04-21 10:04:57.417562687 +0000 UTC m=+65.762012551" lastFinishedPulling="2026-04-21 10:04:59.124608795 +0000 UTC m=+67.469058672" observedRunningTime="2026-04-21 10:04:59.601822775 +0000 UTC m=+67.946272658" 
watchObservedRunningTime="2026-04-21 10:04:59.620521932 +0000 UTC m=+67.964971818" Apr 21 10:04:59.647273 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:04:59.647186 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-75fdf7498f-rzddk" podStartSLOduration=1.5464899650000001 podStartE2EDuration="3.647169906s" podCreationTimestamp="2026-04-21 10:04:56 +0000 UTC" firstStartedPulling="2026-04-21 10:04:57.029483567 +0000 UTC m=+65.373933445" lastFinishedPulling="2026-04-21 10:04:59.13016352 +0000 UTC m=+67.474613386" observedRunningTime="2026-04-21 10:04:59.644678975 +0000 UTC m=+67.989128861" watchObservedRunningTime="2026-04-21 10:04:59.647169906 +0000 UTC m=+67.991619793" Apr 21 10:05:00.596252 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:00.596159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vrs72" event={"ID":"1b8b33c9-4316-4863-843f-730d4490910b","Type":"ContainerStarted","Data":"e3be1bf825d88910a9be0c9365e37d3e414c794bff2d2b403c97177a9020efb9"} Apr 21 10:05:00.596252 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:00.596202 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vrs72" event={"ID":"1b8b33c9-4316-4863-843f-730d4490910b","Type":"ContainerStarted","Data":"6fe378f2fd3c1189d68ebd7134384f3686249850697d266668a78fe6cfd7b785"} Apr 21 10:05:00.611768 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:00.611707 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vrs72" podStartSLOduration=66.861155993 podStartE2EDuration="1m8.611689041s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:04:58.296898618 +0000 UTC m=+66.641348484" lastFinishedPulling="2026-04-21 10:05:00.047431666 +0000 UTC m=+68.391881532" observedRunningTime="2026-04-21 10:05:00.610819923 +0000 UTC m=+68.955269811" 
watchObservedRunningTime="2026-04-21 10:05:00.611689041 +0000 UTC m=+68.956138927" Apr 21 10:05:04.611977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.611940 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd"} Apr 21 10:05:04.612502 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.612477 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802"} Apr 21 10:05:04.613638 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.613610 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lf67k" event={"ID":"8cb513a2-6bf9-465c-bac3-8b87096c0e4e","Type":"ContainerStarted","Data":"90148e6eb4c8fba5bb20fd967ba08f0d7814076612b187a7aced36dd0c643bdc"} Apr 21 10:05:04.614003 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.613979 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:05:04.615228 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.615192 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kknsw" event={"ID":"66ec4bf1-14e6-42c4-9174-6a6f20406a1c","Type":"ContainerStarted","Data":"43f16889e36f4200a4dd5a564d45bb059b3d9663ece272c965999b177952429c"} Apr 21 10:05:04.645205 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.644962 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lf67k" podStartSLOduration=66.466254992 podStartE2EDuration="1m12.644941571s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" 
firstStartedPulling="2026-04-21 10:04:58.239545297 +0000 UTC m=+66.583995164" lastFinishedPulling="2026-04-21 10:05:04.41823188 +0000 UTC m=+72.762681743" observedRunningTime="2026-04-21 10:05:04.629539536 +0000 UTC m=+72.973989421" watchObservedRunningTime="2026-04-21 10:05:04.644941571 +0000 UTC m=+72.989391437" Apr 21 10:05:04.646025 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:04.645983 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kknsw" podStartSLOduration=65.492010075 podStartE2EDuration="1m10.645971187s" podCreationTimestamp="2026-04-21 10:03:54 +0000 UTC" firstStartedPulling="2026-04-21 10:04:59.275856857 +0000 UTC m=+67.620306722" lastFinishedPulling="2026-04-21 10:05:04.429817956 +0000 UTC m=+72.774267834" observedRunningTime="2026-04-21 10:05:04.644895218 +0000 UTC m=+72.989345121" watchObservedRunningTime="2026-04-21 10:05:04.645971187 +0000 UTC m=+72.990421080" Apr 21 10:05:05.507977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:05.507947 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7776896ff5-2q4gp" Apr 21 10:05:06.628536 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:06.628506 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee"} Apr 21 10:05:06.628536 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:06.628539 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103"} Apr 21 10:05:07.634992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:07.634957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349"} Apr 21 10:05:07.634992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:07.634999 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerStarted","Data":"0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97"} Apr 21 10:05:07.675488 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:07.675428 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.787742933 podStartE2EDuration="10.675408382s" podCreationTimestamp="2026-04-21 10:04:57 +0000 UTC" firstStartedPulling="2026-04-21 10:04:58.576984375 +0000 UTC m=+66.921434254" lastFinishedPulling="2026-04-21 10:05:06.464649841 +0000 UTC m=+74.809099703" observedRunningTime="2026-04-21 10:05:07.672933343 +0000 UTC m=+76.017383228" watchObservedRunningTime="2026-04-21 10:05:07.675408382 +0000 UTC m=+76.019858268" Apr 21 10:05:07.923672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:07.923583 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:05:15.895535 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:15.895419 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:05:15.895535 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:15.895462 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:05:35.619987 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:35.619953 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-lf67k" Apr 21 10:05:35.901288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:35.901215 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:05:35.905106 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:35.905080 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5747f67664-2t59r" Apr 21 10:05:57.923839 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:57.923804 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:05:57.942791 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:57.942766 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:05:58.795470 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:05:58.795436 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:11.615809 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.615776 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 10:06:11.616351 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.616275 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="alertmanager" containerID="cri-o://37246dcfc446cbd2bc65c0996f11a2a626ef4643bf4c64318dcd88d279c6d858" gracePeriod=120 Apr 21 10:06:11.616351 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.616298 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-metric" 
containerID="cri-o://8c8602f9a3b7a6eed6aca56d7c48aef67c87f27261a9c847556b768116d8f1e2" gracePeriod=120 Apr 21 10:06:11.616468 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.616331 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-web" containerID="cri-o://ca1b578c57ff9886f0597b5062de5f2c45b0484aaca522dff0d3d0f2809507a2" gracePeriod=120 Apr 21 10:06:11.616468 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.616369 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy" containerID="cri-o://760355aa7c2d9c19e3efa331d362b1f9f314d8cc39d1f8a08851415bddb39c47" gracePeriod=120 Apr 21 10:06:11.616468 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.616369 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="prom-label-proxy" containerID="cri-o://99cb2a804ec16d6bf846841a6b8ce5e1f9772da79cee9b11448e905e32a54c9d" gracePeriod=120 Apr 21 10:06:11.616468 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.616333 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="config-reloader" containerID="cri-o://27e85b104060e0271db6297fc4f35cefa5bcd435e1046e9018d5067ff5c13c2a" gracePeriod=120 Apr 21 10:06:11.825323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825291 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="99cb2a804ec16d6bf846841a6b8ce5e1f9772da79cee9b11448e905e32a54c9d" exitCode=0 Apr 21 10:06:11.825323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825314 
2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="8c8602f9a3b7a6eed6aca56d7c48aef67c87f27261a9c847556b768116d8f1e2" exitCode=0 Apr 21 10:06:11.825323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825321 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="760355aa7c2d9c19e3efa331d362b1f9f314d8cc39d1f8a08851415bddb39c47" exitCode=0 Apr 21 10:06:11.825323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825326 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="27e85b104060e0271db6297fc4f35cefa5bcd435e1046e9018d5067ff5c13c2a" exitCode=0 Apr 21 10:06:11.825323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825331 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="37246dcfc446cbd2bc65c0996f11a2a626ef4643bf4c64318dcd88d279c6d858" exitCode=0 Apr 21 10:06:11.825583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825359 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"99cb2a804ec16d6bf846841a6b8ce5e1f9772da79cee9b11448e905e32a54c9d"} Apr 21 10:06:11.825583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825391 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"8c8602f9a3b7a6eed6aca56d7c48aef67c87f27261a9c847556b768116d8f1e2"} Apr 21 10:06:11.825583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825402 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"760355aa7c2d9c19e3efa331d362b1f9f314d8cc39d1f8a08851415bddb39c47"} Apr 21 10:06:11.825583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825411 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"27e85b104060e0271db6297fc4f35cefa5bcd435e1046e9018d5067ff5c13c2a"} Apr 21 10:06:11.825583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:11.825420 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"37246dcfc446cbd2bc65c0996f11a2a626ef4643bf4c64318dcd88d279c6d858"} Apr 21 10:06:12.832173 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.832142 2567 generic.go:358] "Generic (PLEG): container finished" podID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerID="ca1b578c57ff9886f0597b5062de5f2c45b0484aaca522dff0d3d0f2809507a2" exitCode=0 Apr 21 10:06:12.832534 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.832219 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"ca1b578c57ff9886f0597b5062de5f2c45b0484aaca522dff0d3d0f2809507a2"} Apr 21 10:06:12.855353 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.855328 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 10:06:12.927460 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927431 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-tls-assets\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927460 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927465 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7pwc\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-kube-api-access-v7pwc\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927489 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-config-out\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927529 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927575 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-main-db\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 
10:06:12.927689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927607 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-trusted-ca-bundle\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927636 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-metrics-client-ca\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927660 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927698 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-config-volume\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927755 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-main-tls\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927994 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927797 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-cluster-tls-config\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927825 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-web-config\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.927994 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.927881 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-web\") pod \"8d96bb4c-72ad-4b48-b738-13a156de3777\" (UID: \"8d96bb4c-72ad-4b48-b738-13a156de3777\") " Apr 21 10:06:12.928263 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.928135 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:06:12.928920 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.928498 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). 
InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:06:12.928920 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.928871 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:06:12.931453 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.931391 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:06:12.931570 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.931463 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-config-out" (OuterVolumeSpecName: "config-out") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:06:12.932071 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.931971 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:12.932071 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.932045 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-kube-api-access-v7pwc" (OuterVolumeSpecName: "kube-api-access-v7pwc") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "kube-api-access-v7pwc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:06:12.932372 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.932314 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:12.932372 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.932336 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:12.932729 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.932703 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:12.933026 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.933002 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-config-volume" (OuterVolumeSpecName: "config-volume") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:12.937941 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.937921 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:12.944104 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:12.944045 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-web-config" (OuterVolumeSpecName: "web-config") pod "8d96bb4c-72ad-4b48-b738-13a156de3777" (UID: "8d96bb4c-72ad-4b48-b738-13a156de3777"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:06:13.028708 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028652 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-main-db\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.028708 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028697 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.028708 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028710 2567 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d96bb4c-72ad-4b48-b738-13a156de3777-metrics-client-ca\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.028708 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028720 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.028708 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028730 2567 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-config-volume\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028739 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-main-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028749 2567 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-cluster-tls-config\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028759 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-web-config\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028768 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028776 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-tls-assets\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028784 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7pwc\" (UniqueName: \"kubernetes.io/projected/8d96bb4c-72ad-4b48-b738-13a156de3777-kube-api-access-v7pwc\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028793 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d96bb4c-72ad-4b48-b738-13a156de3777-config-out\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.029010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.028802 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8d96bb4c-72ad-4b48-b738-13a156de3777-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:06:13.838279 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.838240 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8d96bb4c-72ad-4b48-b738-13a156de3777","Type":"ContainerDied","Data":"631bca38ef677bef17362b88f069b2ad98321a72152e8e5163d621cd59e015a2"}
Apr 21 10:06:13.838716 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.838301 2567 scope.go:117] "RemoveContainer" containerID="99cb2a804ec16d6bf846841a6b8ce5e1f9772da79cee9b11448e905e32a54c9d"
Apr 21 10:06:13.838716 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.838300 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:13.847256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.847238 2567 scope.go:117] "RemoveContainer" containerID="8c8602f9a3b7a6eed6aca56d7c48aef67c87f27261a9c847556b768116d8f1e2"
Apr 21 10:06:13.853637 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.853617 2567 scope.go:117] "RemoveContainer" containerID="760355aa7c2d9c19e3efa331d362b1f9f314d8cc39d1f8a08851415bddb39c47"
Apr 21 10:06:13.859933 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.859912 2567 scope.go:117] "RemoveContainer" containerID="ca1b578c57ff9886f0597b5062de5f2c45b0484aaca522dff0d3d0f2809507a2"
Apr 21 10:06:13.862965 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.862942 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 10:06:13.872672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.867721 2567 scope.go:117] "RemoveContainer" containerID="27e85b104060e0271db6297fc4f35cefa5bcd435e1046e9018d5067ff5c13c2a"
Apr 21 10:06:13.872672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.871013 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 10:06:13.880606 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.880571 2567 scope.go:117] "RemoveContainer" containerID="37246dcfc446cbd2bc65c0996f11a2a626ef4643bf4c64318dcd88d279c6d858"
Apr 21 10:06:13.888453 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.888433 2567 scope.go:117] "RemoveContainer" containerID="49e91b6eded9a17bd540233a7fc7c43a52dc4d41d20ff21699de17e9c5dbe213"
Apr 21 10:06:13.897219 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897189 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 10:06:13.897544 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897527 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="prom-label-proxy"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897547 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="prom-label-proxy"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897560 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="init-config-reloader"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897567 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="init-config-reloader"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897575 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897580 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897588 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-metric"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897594 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-metric"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897604 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="config-reloader"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897609 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="config-reloader"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897615 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-web"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897620 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-web"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897631 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="alertmanager"
Apr 21 10:06:13.897632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897636 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="alertmanager"
Apr 21 10:06:13.898086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897677 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="config-reloader"
Apr 21 10:06:13.898086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897685 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="prom-label-proxy"
Apr 21 10:06:13.898086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897693 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-metric"
Apr 21 10:06:13.898086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897702 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="alertmanager"
Apr 21 10:06:13.898086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897711 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy-web"
Apr 21 10:06:13.898086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.897722 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" containerName="kube-rbac-proxy"
Apr 21 10:06:13.903105 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.903085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:13.905741 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.905716 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 10:06:13.905852 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.905717 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 10:06:13.906014 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.905996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 10:06:13.906014 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.906006 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 10:06:13.906200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.906007 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 10:06:13.906200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.906149 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 10:06:13.906200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.906166 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zslxs\""
Apr 21 10:06:13.906200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.906180 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 10:06:13.906364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.906348 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 10:06:13.911332 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.911192 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 10:06:13.912097 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:13.912054 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 10:06:14.037804 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.037772 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.037964 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.037813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-web-config\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.037964 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.037836 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-config-volume\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.037964 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.037919 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.037964 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.037958 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038096 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.037985 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038096 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038017 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abd28c2e-ea3c-490a-8407-1bf197a81d99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038096 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038059 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcxd\" (UniqueName: \"kubernetes.io/projected/abd28c2e-ea3c-490a-8407-1bf197a81d99-kube-api-access-2fcxd\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038220 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038100 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/abd28c2e-ea3c-490a-8407-1bf197a81d99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038220 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038169 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/abd28c2e-ea3c-490a-8407-1bf197a81d99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038220 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038190 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd28c2e-ea3c-490a-8407-1bf197a81d99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038312 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038227 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.038312 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.038266 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/abd28c2e-ea3c-490a-8407-1bf197a81d99-config-out\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.138761 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.138730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.138884 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.138774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/abd28c2e-ea3c-490a-8407-1bf197a81d99-config-out\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.138921 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.138893 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.138956 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.138922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-web-config\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.138956 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.138948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-config-volume\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.138978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139011 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139039 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139236 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139076 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abd28c2e-ea3c-490a-8407-1bf197a81d99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139236 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcxd\" (UniqueName: \"kubernetes.io/projected/abd28c2e-ea3c-490a-8407-1bf197a81d99-kube-api-access-2fcxd\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139236 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139189 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/abd28c2e-ea3c-490a-8407-1bf197a81d99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139384 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/abd28c2e-ea3c-490a-8407-1bf197a81d99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.139384 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.139260 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd28c2e-ea3c-490a-8407-1bf197a81d99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.140290 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.140262 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abd28c2e-ea3c-490a-8407-1bf197a81d99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.140290 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.140285 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abd28c2e-ea3c-490a-8407-1bf197a81d99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.140853 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.140822 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/abd28c2e-ea3c-490a-8407-1bf197a81d99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.141835 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.141786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/abd28c2e-ea3c-490a-8407-1bf197a81d99-config-out\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.142087 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.142043 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-config-volume\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.142212 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.142136 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.142274 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.142224 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.142624 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.142601 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-web-config\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.142747 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.142724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.142932 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.142911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/abd28c2e-ea3c-490a-8407-1bf197a81d99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.143392 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.143374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.143568 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.143551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/abd28c2e-ea3c-490a-8407-1bf197a81d99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.147463 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.147443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcxd\" (UniqueName: \"kubernetes.io/projected/abd28c2e-ea3c-490a-8407-1bf197a81d99-kube-api-access-2fcxd\") pod \"alertmanager-main-0\" (UID: \"abd28c2e-ea3c-490a-8407-1bf197a81d99\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.215135 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.215067 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 10:06:14.257551 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.257512 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d96bb4c-72ad-4b48-b738-13a156de3777" path="/var/lib/kubelet/pods/8d96bb4c-72ad-4b48-b738-13a156de3777/volumes"
Apr 21 10:06:14.343872 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.343849 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 10:06:14.346240 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:06:14.346212 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabd28c2e_ea3c_490a_8407_1bf197a81d99.slice/crio-744aa8bf547bc1dccec8814e67d7032f00fd5d15e20de42ddd4917db8acca0b2 WatchSource:0}: Error finding container 744aa8bf547bc1dccec8814e67d7032f00fd5d15e20de42ddd4917db8acca0b2: Status 404 returned error can't find the container with id 744aa8bf547bc1dccec8814e67d7032f00fd5d15e20de42ddd4917db8acca0b2
Apr 21 10:06:14.842806 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.842770 2567 generic.go:358] "Generic (PLEG): container finished" podID="abd28c2e-ea3c-490a-8407-1bf197a81d99" containerID="95b3b35d0a41eeba5e80585dc266092c7cccc977a3f9e93e866c43d8b6e23b08" exitCode=0
Apr 21 10:06:14.843233 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.842856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerDied","Data":"95b3b35d0a41eeba5e80585dc266092c7cccc977a3f9e93e866c43d8b6e23b08"}
Apr 21 10:06:14.843233 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:14.842903 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"744aa8bf547bc1dccec8814e67d7032f00fd5d15e20de42ddd4917db8acca0b2"}
Apr 21 10:06:15.840404 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.840368 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 10:06:15.840961 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.840929 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="prometheus" containerID="cri-o://484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802" gracePeriod=600
Apr 21 10:06:15.841041 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.840944 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy" containerID="cri-o://0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97" gracePeriod=600
Apr 21 10:06:15.841041 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.840973 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="thanos-sidecar" containerID="cri-o://03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103" gracePeriod=600
Apr 21 10:06:15.841041 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.841001 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-thanos" containerID="cri-o://03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349" gracePeriod=600
Apr 21 10:06:15.841198 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.840982 2567 kuberuntime_container.go:864] "Killing container with a
grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-web" containerID="cri-o://0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee" gracePeriod=600 Apr 21 10:06:15.841198 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.841005 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="config-reloader" containerID="cri-o://af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd" gracePeriod=600 Apr 21 10:06:15.859828 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.859801 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"862293af9c40576735b98f42995bdd3f5451d450cd897a16f2d19e1a853a2fd8"} Apr 21 10:06:15.860170 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.859838 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"ed48fdad68d57ef116441ab49a90ced308cf3836276eeb427dcfab660a4bcedb"} Apr 21 10:06:15.860170 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.859851 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"cf8fca0a80cdcd150cae745dd2b90f6e8b8b45d5929a722cd6db8947ed130ff9"} Apr 21 10:06:15.860170 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.859864 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"e00d62063063c25408e4207e2d9c9688bdf1c9a497b7ce0013f69f6b3a715651"} Apr 21 10:06:15.860170 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.859875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"a4a72d8fa4732a638e50e8098258a99bc7657c5be36ad98a534044d7d745a5b0"} Apr 21 10:06:15.860170 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.859887 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"abd28c2e-ea3c-490a-8407-1bf197a81d99","Type":"ContainerStarted","Data":"14d26a565367d812f0b723cd68a2a6c0c4e56df12cef3822577e288a09068fb0"} Apr 21 10:06:15.885399 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:15.885362 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.885347049 podStartE2EDuration="2.885347049s" podCreationTimestamp="2026-04-21 10:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:15.883476944 +0000 UTC m=+144.227926829" watchObservedRunningTime="2026-04-21 10:06:15.885347049 +0000 UTC m=+144.229796934" Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.866946 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerID="03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349" exitCode=0 Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.866977 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerID="0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97" exitCode=0 Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.866985 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" 
containerID="03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103" exitCode=0 Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.866993 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerID="af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd" exitCode=0 Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.867000 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerID="484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802" exitCode=0 Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.868202 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349"} Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.868237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97"} Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.868253 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103"} Apr 21 10:06:16.868377 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:16.868268 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd"} Apr 21 10:06:16.868377 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:06:16.868280 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802"} Apr 21 10:06:17.076143 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.076098 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:17.166601 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166574 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-rulefiles-0\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.166741 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166621 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-serving-certs-ca-bundle\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.166741 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166653 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-db\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.166869 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166834 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-tls\") pod 
\"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.166924 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166888 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-kube-rbac-proxy\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.166924 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166914 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-metrics-client-ca\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167021 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166938 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-metrics-client-certs\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167021 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.166973 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167021 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167012 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8k89\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-kube-api-access-j8k89\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: 
\"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167190 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167030 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:06:17.167190 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167036 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-trusted-ca-bundle\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167190 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167091 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-grpc-tls\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167336 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167146 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-web-config\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167336 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167278 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-kubelet-serving-ca-bundle\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167336 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167312 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config-out\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167486 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167343 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:06:17.167486 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167349 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-tls-assets\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167486 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167371 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:06:17.167486 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167415 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167486 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167446 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167486 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167473 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-thanos-prometheus-http-client-file\") pod \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\" (UID: \"62fcbae9-5553-48a3-93d6-a940e5ec6fff\") " Apr 21 10:06:17.167784 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167762 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.167843 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167790 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-metrics-client-ca\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 
10:06:17.167843 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167806 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.167946 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.167898 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:06:17.168416 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.168385 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:06:17.168835 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.168807 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:06:17.171254 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.171181 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.171677 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.171634 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.171792 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.171725 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:06:17.171856 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.171835 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.172098 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172070 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config-out" (OuterVolumeSpecName: "config-out") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:06:17.172098 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172072 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config" (OuterVolumeSpecName: "config") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.172301 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172154 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-kube-api-access-j8k89" (OuterVolumeSpecName: "kube-api-access-j8k89") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "kube-api-access-j8k89". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:06:17.172301 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172181 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.172301 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172261 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.172415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172323 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.172415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.172399 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.181270 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.181248 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-web-config" (OuterVolumeSpecName: "web-config") pod "62fcbae9-5553-48a3-93d6-a940e5ec6fff" (UID: "62fcbae9-5553-48a3-93d6-a940e5ec6fff"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:06:17.268169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268099 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-tls-assets\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268136 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268146 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268156 2567 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268166 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268175 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-prometheus-k8s-db\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268184 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268193 2567 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-kube-rbac-proxy\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268202 2567 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-metrics-client-certs\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268210 2567 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268219 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8k89\" (UniqueName: \"kubernetes.io/projected/62fcbae9-5553-48a3-93d6-a940e5ec6fff-kube-api-access-j8k89\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268228 2567 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-secret-grpc-tls\") on 
node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268236 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/62fcbae9-5553-48a3-93d6-a940e5ec6fff-web-config\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268243 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fcbae9-5553-48a3-93d6-a940e5ec6fff-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.268370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.268252 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/62fcbae9-5553-48a3-93d6-a940e5ec6fff-config-out\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:06:17.872848 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.872816 2567 generic.go:358] "Generic (PLEG): container finished" podID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerID="0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee" exitCode=0 Apr 21 10:06:17.873238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.872892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee"} Apr 21 10:06:17.873238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.872911 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:17.873238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.872929 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"62fcbae9-5553-48a3-93d6-a940e5ec6fff","Type":"ContainerDied","Data":"091d50bd447fde9bab39d1888ae54db989012fb138ee1c1af6f78d12ccf5dd13"} Apr 21 10:06:17.873238 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.872946 2567 scope.go:117] "RemoveContainer" containerID="03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349" Apr 21 10:06:17.880297 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.880276 2567 scope.go:117] "RemoveContainer" containerID="0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97" Apr 21 10:06:17.886767 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.886751 2567 scope.go:117] "RemoveContainer" containerID="0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee" Apr 21 10:06:17.892907 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.892891 2567 scope.go:117] "RemoveContainer" containerID="03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103" Apr 21 10:06:17.896517 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.896496 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:06:17.899548 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.899521 2567 scope.go:117] "RemoveContainer" containerID="af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd" Apr 21 10:06:17.901567 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.901543 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:06:17.905653 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.905637 2567 scope.go:117] "RemoveContainer" containerID="484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802" Apr 21 10:06:17.911867 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:06:17.911847 2567 scope.go:117] "RemoveContainer" containerID="d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5" Apr 21 10:06:17.917966 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.917949 2567 scope.go:117] "RemoveContainer" containerID="03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349" Apr 21 10:06:17.918214 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.918195 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349\": container with ID starting with 03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349 not found: ID does not exist" containerID="03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349" Apr 21 10:06:17.918257 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918221 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349"} err="failed to get container status \"03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349\": rpc error: code = NotFound desc = could not find container \"03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349\": container with ID starting with 03c2a296635913e56098c44b81c1e797497cf3b65c24168f8ef67aca3fa3a349 not found: ID does not exist" Apr 21 10:06:17.918257 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918250 2567 scope.go:117] "RemoveContainer" containerID="0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97" Apr 21 10:06:17.918452 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.918437 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97\": container with ID starting with 
0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97 not found: ID does not exist" containerID="0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97" Apr 21 10:06:17.918493 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918458 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97"} err="failed to get container status \"0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97\": rpc error: code = NotFound desc = could not find container \"0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97\": container with ID starting with 0e5c0e836c84e33e03dcff506260dfdb0683c7cf165567727b72528218b8ff97 not found: ID does not exist" Apr 21 10:06:17.918493 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918472 2567 scope.go:117] "RemoveContainer" containerID="0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee" Apr 21 10:06:17.918707 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.918689 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee\": container with ID starting with 0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee not found: ID does not exist" containerID="0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee" Apr 21 10:06:17.918755 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918712 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee"} err="failed to get container status \"0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee\": rpc error: code = NotFound desc = could not find container \"0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee\": container with ID starting with 
0a01db2f1a6f999ea060c95d51ab1b8200edb328c22c647044fbfff993c94cee not found: ID does not exist" Apr 21 10:06:17.918755 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918733 2567 scope.go:117] "RemoveContainer" containerID="03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103" Apr 21 10:06:17.918927 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.918911 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103\": container with ID starting with 03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103 not found: ID does not exist" containerID="03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103" Apr 21 10:06:17.918978 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918929 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103"} err="failed to get container status \"03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103\": rpc error: code = NotFound desc = could not find container \"03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103\": container with ID starting with 03e89c1750b7fc1718c7776d1076bf321524c4dfb4d84ab826d02cbf0a0b3103 not found: ID does not exist" Apr 21 10:06:17.918978 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.918941 2567 scope.go:117] "RemoveContainer" containerID="af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd" Apr 21 10:06:17.919205 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.919184 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd\": container with ID starting with af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd not found: ID does not exist" 
containerID="af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd" Apr 21 10:06:17.919256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.919214 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd"} err="failed to get container status \"af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd\": rpc error: code = NotFound desc = could not find container \"af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd\": container with ID starting with af4b5fee7a97e671da1c86f3fbc636d942349a5330d504f3caa9dea6acf2fabd not found: ID does not exist" Apr 21 10:06:17.919256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.919232 2567 scope.go:117] "RemoveContainer" containerID="484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802" Apr 21 10:06:17.919457 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.919441 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802\": container with ID starting with 484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802 not found: ID does not exist" containerID="484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802" Apr 21 10:06:17.919502 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.919460 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802"} err="failed to get container status \"484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802\": rpc error: code = NotFound desc = could not find container \"484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802\": container with ID starting with 484f70437ca7d410cbf318d03c559a0f03bf7b50ce09089031d2663706a44802 not found: ID does not exist" Apr 21 
10:06:17.919502 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.919472 2567 scope.go:117] "RemoveContainer" containerID="d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5" Apr 21 10:06:17.919665 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:06:17.919650 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5\": container with ID starting with d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5 not found: ID does not exist" containerID="d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5" Apr 21 10:06:17.919705 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.919665 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5"} err="failed to get container status \"d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5\": rpc error: code = NotFound desc = could not find container \"d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5\": container with ID starting with d57347797ecd85da47d2418b1772f9514d53a6a9e2b6c81cc3581571354c12b5 not found: ID does not exist" Apr 21 10:06:17.922814 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.922795 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:06:17.923081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923070 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="config-reloader" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923083 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="config-reloader" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923090 2567 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="thanos-sidecar" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923097 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="thanos-sidecar" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923104 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="init-config-reloader" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923125 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="init-config-reloader" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923133 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-web" Apr 21 10:06:17.923137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923138 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-web" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923146 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="prometheus" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923151 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="prometheus" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923156 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy" Apr 21 10:06:17.923352 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:06:17.923162 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923180 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-thanos" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923185 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-thanos" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923228 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923237 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="thanos-sidecar" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923243 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="config-reloader" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923249 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-web" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923255 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="kube-rbac-proxy-thanos" Apr 21 10:06:17.923352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.923261 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" containerName="prometheus" Apr 21 10:06:17.928291 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.928274 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:17.931312 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931292 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 10:06:17.931442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931355 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 10:06:17.931442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931292 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 10:06:17.931442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931361 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wnq4m\"" Apr 21 10:06:17.931442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 10:06:17.931665 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931511 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 10:06:17.931665 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931531 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 10:06:17.931665 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931645 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 10:06:17.931758 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:06:17.931711 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ma6a69sqc1ei\"" Apr 21 10:06:17.931758 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.931711 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 10:06:17.932093 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.932079 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 10:06:17.932386 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.932368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 10:06:17.935891 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.935871 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 10:06:17.946691 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.946667 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 10:06:17.961358 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:17.961312 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 10:06:18.073972 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.073947 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.073981 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074001 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074226 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074071 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074226 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074142 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-web-config\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074226 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074172 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074226 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074197 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074226 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074216 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074367 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074237 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-config\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074367 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074367 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074308 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074367 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab962edd-a3ba-4beb-aff2-b9311b2938aa-config-out\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074367 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074360 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074377 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074393 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074521 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:06:18.074440 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmjr\" (UniqueName: \"kubernetes.io/projected/ab962edd-a3ba-4beb-aff2-b9311b2938aa-kube-api-access-6kmjr\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.074521 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.074474 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab962edd-a3ba-4beb-aff2-b9311b2938aa-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.174861 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.174828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.174861 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.174865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175030 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.174919 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175030 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.174948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-web-config\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175030 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.174969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175030 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-config\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175183 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab962edd-a3ba-4beb-aff2-b9311b2938aa-config-out\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175256 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175242 2567 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175550 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175261 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175550 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175283 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175550 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175550 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmjr\" (UniqueName: \"kubernetes.io/projected/ab962edd-a3ba-4beb-aff2-b9311b2938aa-kube-api-access-6kmjr\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 10:06:18.175550 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175365 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab962edd-a3ba-4beb-aff2-b9311b2938aa-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.175550 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175410 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.175920 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.175898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.177248 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.176026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.177248 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.176489 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.177248 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.176900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.178536 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.177773 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.178536 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.178410 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-config\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.178536 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.178482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-web-config\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.178759 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.178694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.179177 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.179141 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.179392 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.179355 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ab962edd-a3ba-4beb-aff2-b9311b2938aa-config-out\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.179506 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.179445 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ab962edd-a3ba-4beb-aff2-b9311b2938aa-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.179883 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.179855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.180144 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.180123 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.180268 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.180249 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.180903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.180881 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ab962edd-a3ba-4beb-aff2-b9311b2938aa-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.181004 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.180986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.181155 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.181140 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ab962edd-a3ba-4beb-aff2-b9311b2938aa-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.185374 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.185352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmjr\" (UniqueName: \"kubernetes.io/projected/ab962edd-a3ba-4beb-aff2-b9311b2938aa-kube-api-access-6kmjr\") pod \"prometheus-k8s-0\" (UID: \"ab962edd-a3ba-4beb-aff2-b9311b2938aa\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.238631 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.238578 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:06:18.257423 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.257396 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62fcbae9-5553-48a3-93d6-a940e5ec6fff" path="/var/lib/kubelet/pods/62fcbae9-5553-48a3-93d6-a940e5ec6fff/volumes"
Apr 21 10:06:18.361361 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.361338 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 10:06:18.363250 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:06:18.363224 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab962edd_a3ba_4beb_aff2_b9311b2938aa.slice/crio-98690435b2d5c4d4b5dffde0b8bfac5b0800eb598f7657a89391d66c8e128c44 WatchSource:0}: Error finding container 98690435b2d5c4d4b5dffde0b8bfac5b0800eb598f7657a89391d66c8e128c44: Status 404 returned error can't find the container with id 98690435b2d5c4d4b5dffde0b8bfac5b0800eb598f7657a89391d66c8e128c44
Apr 21 10:06:18.877466 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.877427 2567 generic.go:358] "Generic (PLEG): container finished" podID="ab962edd-a3ba-4beb-aff2-b9311b2938aa" containerID="37f5e8f0bec7d4a3f7b73e2286383ed865728949e9dbf7ce677ddc6f27c419de" exitCode=0
Apr 21 10:06:18.877894 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.877511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerDied","Data":"37f5e8f0bec7d4a3f7b73e2286383ed865728949e9dbf7ce677ddc6f27c419de"}
Apr 21 10:06:18.877894 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:18.877544 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"98690435b2d5c4d4b5dffde0b8bfac5b0800eb598f7657a89391d66c8e128c44"}
Apr 21 10:06:19.888803 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.888762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"a89e73404f7e42fa9b4474abc3f08964d897114d33c95798526bc96455e41f0d"}
Apr 21 10:06:19.889277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.888811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"7fcb0afe74a7427518cacead4cbbcdb4a038bf84b40c3f1e91ec1ddf2fbd3094"}
Apr 21 10:06:19.889277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.888830 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"5abfbaa689dfd06233cdb39f095817786096f0ee0100bd0b99b1af793c53fe32"}
Apr 21 10:06:19.889277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.888844 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"ff47e11510edc4435c4ac2808c80b12bb6cfd3ab13c0b877afedc115d20a6164"}
Apr 21 10:06:19.889277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.888857 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"16bd0bffcc14668b6f83558471234edf3b3c1ad2e2573b7dd54cad037bb143a4"}
Apr 21 10:06:19.889277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.888878 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ab962edd-a3ba-4beb-aff2-b9311b2938aa","Type":"ContainerStarted","Data":"32283538e601a37f90a53e9e895304f8681b7bb0885044fb61bb9f71e24c2ad3"}
Apr 21 10:06:19.929682 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:19.929549 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.929530428 podStartE2EDuration="2.929530428s" podCreationTimestamp="2026-04-21 10:06:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:19.926925261 +0000 UTC m=+148.271375146" watchObservedRunningTime="2026-04-21 10:06:19.929530428 +0000 UTC m=+148.273980312"
Apr 21 10:06:23.239194 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:06:23.239143 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:07:18.239013 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:07:18.238975 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:07:18.256487 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:07:18.256465 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:07:19.070227 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:07:19.070203 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 10:08:52.108424 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:08:52.108394 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log"
Apr 21 10:08:52.110682 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:08:52.110649 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log"
Apr 21 10:08:52.119199 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:08:52.119171 2567 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 10:10:47.586150 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.586102 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-mm8cg"]
Apr 21 10:10:47.589415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.589394 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:47.592005 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.591980 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 10:10:47.592138 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.591993 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 10:10:47.592138 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.592061 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-mm59h\""
Apr 21 10:10:47.592896 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.592882 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 21 10:10:47.598343 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.598317 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mm8cg"]
Apr 21 10:10:47.603760 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.603741 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gg64n"]
Apr 21 10:10:47.606753 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.606738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.609167 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.609148 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-qmw9t\""
Apr 21 10:10:47.609167 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.609161 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 21 10:10:47.616910 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.616891 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gg64n"]
Apr 21 10:10:47.744215 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.744184 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7dc782-b5b4-40b6-9957-e0e724ebdbcd-cert\") pod \"odh-model-controller-696fc77849-gg64n\" (UID: \"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd\") " pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.744364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.744228 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwh84\" (UniqueName: \"kubernetes.io/projected/ae7dc782-b5b4-40b6-9957-e0e724ebdbcd-kube-api-access-pwh84\") pod \"odh-model-controller-696fc77849-gg64n\" (UID: \"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd\") " pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.744364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.744256 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hz6\" (UniqueName: \"kubernetes.io/projected/f70e23cc-b117-459b-9f7d-bfd08eaf9280-kube-api-access-g5hz6\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:47.744364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.744280 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f70e23cc-b117-459b-9f7d-bfd08eaf9280-tls-certs\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:47.845495 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.845425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f70e23cc-b117-459b-9f7d-bfd08eaf9280-tls-certs\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:47.845620 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.845508 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7dc782-b5b4-40b6-9957-e0e724ebdbcd-cert\") pod \"odh-model-controller-696fc77849-gg64n\" (UID: \"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd\") " pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.845620 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.845537 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwh84\" (UniqueName: \"kubernetes.io/projected/ae7dc782-b5b4-40b6-9957-e0e724ebdbcd-kube-api-access-pwh84\") pod \"odh-model-controller-696fc77849-gg64n\" (UID: \"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd\") " pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.845620 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.845553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5hz6\" (UniqueName: \"kubernetes.io/projected/f70e23cc-b117-459b-9f7d-bfd08eaf9280-kube-api-access-g5hz6\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:47.845620 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:10:47.845568 2567 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 21 10:10:47.845803 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:10:47.845646 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f70e23cc-b117-459b-9f7d-bfd08eaf9280-tls-certs podName:f70e23cc-b117-459b-9f7d-bfd08eaf9280 nodeName:}" failed. No retries permitted until 2026-04-21 10:10:48.345627852 +0000 UTC m=+416.690077729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/f70e23cc-b117-459b-9f7d-bfd08eaf9280-tls-certs") pod "model-serving-api-86f7b4b499-mm8cg" (UID: "f70e23cc-b117-459b-9f7d-bfd08eaf9280") : secret "model-serving-api-tls" not found
Apr 21 10:10:47.847889 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.847866 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae7dc782-b5b4-40b6-9957-e0e724ebdbcd-cert\") pod \"odh-model-controller-696fc77849-gg64n\" (UID: \"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd\") " pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.854455 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.854429 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwh84\" (UniqueName: \"kubernetes.io/projected/ae7dc782-b5b4-40b6-9957-e0e724ebdbcd-kube-api-access-pwh84\") pod \"odh-model-controller-696fc77849-gg64n\" (UID: \"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd\") " pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:47.854585 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.854471 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5hz6\" (UniqueName: \"kubernetes.io/projected/f70e23cc-b117-459b-9f7d-bfd08eaf9280-kube-api-access-g5hz6\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:47.917139 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:47.917087 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:48.034870 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.034788 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gg64n"]
Apr 21 10:10:48.037386 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:10:48.037357 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae7dc782_b5b4_40b6_9957_e0e724ebdbcd.slice/crio-bcfc44a87fab7c856e6f3117279f29eb1bd20f9a8779d0fdf0a84cdcd8783520 WatchSource:0}: Error finding container bcfc44a87fab7c856e6f3117279f29eb1bd20f9a8779d0fdf0a84cdcd8783520: Status 404 returned error can't find the container with id bcfc44a87fab7c856e6f3117279f29eb1bd20f9a8779d0fdf0a84cdcd8783520
Apr 21 10:10:48.038649 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.038627 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:10:48.349371 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.349343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f70e23cc-b117-459b-9f7d-bfd08eaf9280-tls-certs\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:48.351632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.351613 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f70e23cc-b117-459b-9f7d-bfd08eaf9280-tls-certs\") pod \"model-serving-api-86f7b4b499-mm8cg\" (UID: \"f70e23cc-b117-459b-9f7d-bfd08eaf9280\") " pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:48.500576 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.500545 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:48.626859 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.626762 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gg64n" event={"ID":"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd","Type":"ContainerStarted","Data":"bcfc44a87fab7c856e6f3117279f29eb1bd20f9a8779d0fdf0a84cdcd8783520"}
Apr 21 10:10:48.640312 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:48.640262 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mm8cg"]
Apr 21 10:10:48.643958 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:10:48.643925 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf70e23cc_b117_459b_9f7d_bfd08eaf9280.slice/crio-8cba91e93a60014a9bf7dbed9236d1042c02b6306ea7b9ef1d0c458113770420 WatchSource:0}: Error finding container 8cba91e93a60014a9bf7dbed9236d1042c02b6306ea7b9ef1d0c458113770420: Status 404 returned error can't find the container with id 8cba91e93a60014a9bf7dbed9236d1042c02b6306ea7b9ef1d0c458113770420
Apr 21 10:10:49.632173 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:49.632133 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mm8cg" event={"ID":"f70e23cc-b117-459b-9f7d-bfd08eaf9280","Type":"ContainerStarted","Data":"8cba91e93a60014a9bf7dbed9236d1042c02b6306ea7b9ef1d0c458113770420"}
Apr 21 10:10:51.638961 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:51.638913 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gg64n" event={"ID":"ae7dc782-b5b4-40b6-9957-e0e724ebdbcd","Type":"ContainerStarted","Data":"febe402ca0d5452efa6feea3d6efd9aeeddf127fe47eb02ff43ac2bd4b926029"}
Apr 21 10:10:51.639325 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:51.639058 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:10:51.656977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:51.656927 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gg64n" podStartSLOduration=1.123107266 podStartE2EDuration="4.656910531s" podCreationTimestamp="2026-04-21 10:10:47 +0000 UTC" firstStartedPulling="2026-04-21 10:10:48.03875748 +0000 UTC m=+416.383207343" lastFinishedPulling="2026-04-21 10:10:51.57256073 +0000 UTC m=+419.917010608" observedRunningTime="2026-04-21 10:10:51.655386942 +0000 UTC m=+419.999836828" watchObservedRunningTime="2026-04-21 10:10:51.656910531 +0000 UTC m=+420.001360420"
Apr 21 10:10:52.643157 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:52.643099 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mm8cg" event={"ID":"f70e23cc-b117-459b-9f7d-bfd08eaf9280","Type":"ContainerStarted","Data":"a1c3131d6572440a9916f65929544c71a783a87c1739984204347a379fbfd4af"}
Apr 21 10:10:52.643539 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:52.643266 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:10:52.662298 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:10:52.662255 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-mm8cg" podStartSLOduration=2.686792829 podStartE2EDuration="5.66224275s" podCreationTimestamp="2026-04-21 10:10:47 +0000 UTC" firstStartedPulling="2026-04-21 10:10:48.646220071 +0000 UTC m=+416.990669948" lastFinishedPulling="2026-04-21 10:10:51.621670002 +0000 UTC m=+419.966119869" observedRunningTime="2026-04-21 10:10:52.660542672 +0000 UTC m=+421.004992556" watchObservedRunningTime="2026-04-21 10:10:52.66224275 +0000 UTC m=+421.006692635"
Apr 21 10:11:02.645152 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:02.645105 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gg64n"
Apr 21 10:11:03.604046 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.604013 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-9hzb5"]
Apr 21 10:11:03.607395 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.607377 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9hzb5"
Apr 21 10:11:03.609706 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.609670 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 21 10:11:03.609818 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.609807 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kw59d\""
Apr 21 10:11:03.613236 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.613204 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9hzb5"]
Apr 21 10:11:03.650314 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.650294 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-mm8cg"
Apr 21 10:11:03.677122 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.677084 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxjw\" (UniqueName: \"kubernetes.io/projected/eb58351b-d299-4c51-aabe-7f3625ac6226-kube-api-access-slxjw\") pod \"s3-init-9hzb5\" (UID: \"eb58351b-d299-4c51-aabe-7f3625ac6226\") " pod="kserve/s3-init-9hzb5"
Apr 21 10:11:03.778194 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.778164 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slxjw\" (UniqueName: \"kubernetes.io/projected/eb58351b-d299-4c51-aabe-7f3625ac6226-kube-api-access-slxjw\") pod \"s3-init-9hzb5\" (UID: \"eb58351b-d299-4c51-aabe-7f3625ac6226\") " pod="kserve/s3-init-9hzb5"
Apr 21 10:11:03.786714 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.786688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slxjw\" (UniqueName: \"kubernetes.io/projected/eb58351b-d299-4c51-aabe-7f3625ac6226-kube-api-access-slxjw\") pod \"s3-init-9hzb5\" (UID: \"eb58351b-d299-4c51-aabe-7f3625ac6226\") " pod="kserve/s3-init-9hzb5"
Apr 21 10:11:03.928932 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:03.928870 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9hzb5"
Apr 21 10:11:04.042956 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:04.042926 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9hzb5"]
Apr 21 10:11:04.046725 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:11:04.046685 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb58351b_d299_4c51_aabe_7f3625ac6226.slice/crio-d904549f390d2e4a4d11a2357c46077a68c1742df6ec58045e14d9da83ecd88c WatchSource:0}: Error finding container d904549f390d2e4a4d11a2357c46077a68c1742df6ec58045e14d9da83ecd88c: Status 404 returned error can't find the container with id d904549f390d2e4a4d11a2357c46077a68c1742df6ec58045e14d9da83ecd88c
Apr 21 10:11:04.683262 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:04.683217 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9hzb5" event={"ID":"eb58351b-d299-4c51-aabe-7f3625ac6226","Type":"ContainerStarted","Data":"d904549f390d2e4a4d11a2357c46077a68c1742df6ec58045e14d9da83ecd88c"}
Apr 21 10:11:09.705226 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:09.705187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9hzb5" event={"ID":"eb58351b-d299-4c51-aabe-7f3625ac6226","Type":"ContainerStarted","Data":"3aa72f901af3ce8900bf38ce18e8a25f813d4830b9b070bf68e30ce1b07cd406"}
Apr 21 10:11:09.721257 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:09.721205 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-9hzb5" podStartSLOduration=1.909659429 podStartE2EDuration="6.721189745s" podCreationTimestamp="2026-04-21 10:11:03 +0000 UTC" firstStartedPulling="2026-04-21 10:11:04.048629912 +0000 UTC m=+432.393079774" lastFinishedPulling="2026-04-21 10:11:08.860160228 +0000 UTC m=+437.204610090" observedRunningTime="2026-04-21 10:11:09.720221731 +0000 UTC m=+438.064671617" watchObservedRunningTime="2026-04-21 10:11:09.721189745 +0000 UTC m=+438.065639629"
Apr 21 10:11:12.717757 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:12.717723 2567 generic.go:358] "Generic (PLEG): container finished" podID="eb58351b-d299-4c51-aabe-7f3625ac6226" containerID="3aa72f901af3ce8900bf38ce18e8a25f813d4830b9b070bf68e30ce1b07cd406" exitCode=0
Apr 21 10:11:12.718132 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:12.717777 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9hzb5" event={"ID":"eb58351b-d299-4c51-aabe-7f3625ac6226","Type":"ContainerDied","Data":"3aa72f901af3ce8900bf38ce18e8a25f813d4830b9b070bf68e30ce1b07cd406"}
Apr 21 10:11:13.847315 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:13.847294 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9hzb5"
Apr 21 10:11:13.964445 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:13.964416 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slxjw\" (UniqueName: \"kubernetes.io/projected/eb58351b-d299-4c51-aabe-7f3625ac6226-kube-api-access-slxjw\") pod \"eb58351b-d299-4c51-aabe-7f3625ac6226\" (UID: \"eb58351b-d299-4c51-aabe-7f3625ac6226\") "
Apr 21 10:11:13.966424 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:13.966395 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb58351b-d299-4c51-aabe-7f3625ac6226-kube-api-access-slxjw" (OuterVolumeSpecName: "kube-api-access-slxjw") pod "eb58351b-d299-4c51-aabe-7f3625ac6226" (UID: "eb58351b-d299-4c51-aabe-7f3625ac6226"). InnerVolumeSpecName "kube-api-access-slxjw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:11:14.065667 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:14.065640 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-slxjw\" (UniqueName: \"kubernetes.io/projected/eb58351b-d299-4c51-aabe-7f3625ac6226-kube-api-access-slxjw\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:11:14.724564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:14.724531 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9hzb5" event={"ID":"eb58351b-d299-4c51-aabe-7f3625ac6226","Type":"ContainerDied","Data":"d904549f390d2e4a4d11a2357c46077a68c1742df6ec58045e14d9da83ecd88c"}
Apr 21 10:11:14.724564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:14.724566 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d904549f390d2e4a4d11a2357c46077a68c1742df6ec58045e14d9da83ecd88c"
Apr 21 10:11:14.724785 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:14.724544 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9hzb5"
Apr 21 10:11:24.845716 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.845675 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"]
Apr 21 10:11:24.846140 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.846125 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb58351b-d299-4c51-aabe-7f3625ac6226" containerName="s3-init"
Apr 21 10:11:24.846188 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.846143 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb58351b-d299-4c51-aabe-7f3625ac6226" containerName="s3-init"
Apr 21 10:11:24.846240 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.846230 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb58351b-d299-4c51-aabe-7f3625ac6226" containerName="s3-init"
Apr 21 10:11:24.916529 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.916497 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"]
Apr 21 10:11:24.916667 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.916627 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"
Apr 21 10:11:24.918951 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.918925 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5m4mw\""
Apr 21 10:11:24.959516 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:24.959481 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ecbb08d-be76-445c-9258-6b25e9b21cb4-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-hqh7l\" (UID: \"7ecbb08d-be76-445c-9258-6b25e9b21cb4\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"
Apr 21 10:11:25.022468 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.022437 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx"]
Apr 21 10:11:25.043958 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.043928 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx"]
Apr 21 10:11:25.044094 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.044070 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:11:25.060494 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.060468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ecbb08d-be76-445c-9258-6b25e9b21cb4-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-hqh7l\" (UID: \"7ecbb08d-be76-445c-9258-6b25e9b21cb4\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:11:25.060799 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.060780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ecbb08d-be76-445c-9258-6b25e9b21cb4-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-hqh7l\" (UID: \"7ecbb08d-be76-445c-9258-6b25e9b21cb4\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:11:25.161551 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.161487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/073b40f3-91a8-4b2a-9419-e009f64a8680-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx\" (UID: \"073b40f3-91a8-4b2a-9419-e009f64a8680\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:11:25.226624 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.226594 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:11:25.262829 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.262802 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/073b40f3-91a8-4b2a-9419-e009f64a8680-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx\" (UID: \"073b40f3-91a8-4b2a-9419-e009f64a8680\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:11:25.263177 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.263159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/073b40f3-91a8-4b2a-9419-e009f64a8680-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx\" (UID: \"073b40f3-91a8-4b2a-9419-e009f64a8680\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:11:25.344254 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.344164 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"] Apr 21 10:11:25.346678 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:11:25.346650 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ecbb08d_be76_445c_9258_6b25e9b21cb4.slice/crio-01a65bdd57d07c2879b8e8c3e6fb64b88d4e16af88b9d403e8b207fdb8e230ed WatchSource:0}: Error finding container 01a65bdd57d07c2879b8e8c3e6fb64b88d4e16af88b9d403e8b207fdb8e230ed: Status 404 returned error can't find the container with id 01a65bdd57d07c2879b8e8c3e6fb64b88d4e16af88b9d403e8b207fdb8e230ed Apr 21 10:11:25.355848 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.355830 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:11:25.476516 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.476486 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx"] Apr 21 10:11:25.479344 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:11:25.479317 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073b40f3_91a8_4b2a_9419_e009f64a8680.slice/crio-2e66737eedde381b41374b69b9db4273d7b23c7e93a6e3e506542d23e1f235bc WatchSource:0}: Error finding container 2e66737eedde381b41374b69b9db4273d7b23c7e93a6e3e506542d23e1f235bc: Status 404 returned error can't find the container with id 2e66737eedde381b41374b69b9db4273d7b23c7e93a6e3e506542d23e1f235bc Apr 21 10:11:25.758868 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.758778 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" event={"ID":"073b40f3-91a8-4b2a-9419-e009f64a8680","Type":"ContainerStarted","Data":"2e66737eedde381b41374b69b9db4273d7b23c7e93a6e3e506542d23e1f235bc"} Apr 21 10:11:25.759882 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:25.759856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" event={"ID":"7ecbb08d-be76-445c-9258-6b25e9b21cb4","Type":"ContainerStarted","Data":"01a65bdd57d07c2879b8e8c3e6fb64b88d4e16af88b9d403e8b207fdb8e230ed"} Apr 21 10:11:29.773917 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:29.773881 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" event={"ID":"073b40f3-91a8-4b2a-9419-e009f64a8680","Type":"ContainerStarted","Data":"48be88be561ba97386e2e5d99920e9933114e9d1f9cfd6e9425eeba1692f0b0e"} Apr 21 10:11:29.775125 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:11:29.775089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" event={"ID":"7ecbb08d-be76-445c-9258-6b25e9b21cb4","Type":"ContainerStarted","Data":"3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74"} Apr 21 10:11:32.784835 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:32.784803 2567 generic.go:358] "Generic (PLEG): container finished" podID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerID="3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74" exitCode=0 Apr 21 10:11:32.785207 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:32.784868 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" event={"ID":"7ecbb08d-be76-445c-9258-6b25e9b21cb4","Type":"ContainerDied","Data":"3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74"} Apr 21 10:11:33.789539 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:33.789499 2567 generic.go:358] "Generic (PLEG): container finished" podID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerID="48be88be561ba97386e2e5d99920e9933114e9d1f9cfd6e9425eeba1692f0b0e" exitCode=0 Apr 21 10:11:33.790094 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:33.789565 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" event={"ID":"073b40f3-91a8-4b2a-9419-e009f64a8680","Type":"ContainerDied","Data":"48be88be561ba97386e2e5d99920e9933114e9d1f9cfd6e9425eeba1692f0b0e"} Apr 21 10:11:58.880680 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:58.880636 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" event={"ID":"7ecbb08d-be76-445c-9258-6b25e9b21cb4","Type":"ContainerStarted","Data":"624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62"} Apr 21 10:11:58.882253 ip-10-0-129-84 kubenswrapper[2567]: 
I0421 10:11:58.882229 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" event={"ID":"073b40f3-91a8-4b2a-9419-e009f64a8680","Type":"ContainerStarted","Data":"cac89c33a1ba202bc3a4e02b63bf955f3e86fe40070261592d57e280b1dfe356"} Apr 21 10:11:58.882592 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:58.882570 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:11:58.883770 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:58.883730 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:11:58.900874 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:58.900822 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podStartSLOduration=0.921068669 podStartE2EDuration="33.900809439s" podCreationTimestamp="2026-04-21 10:11:25 +0000 UTC" firstStartedPulling="2026-04-21 10:11:25.481146242 +0000 UTC m=+453.825596104" lastFinishedPulling="2026-04-21 10:11:58.460887011 +0000 UTC m=+486.805336874" observedRunningTime="2026-04-21 10:11:58.900548557 +0000 UTC m=+487.244998442" watchObservedRunningTime="2026-04-21 10:11:58.900809439 +0000 UTC m=+487.245259326" Apr 21 10:11:59.887283 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:11:59.887242 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:11:59.906535 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:11:59.906485 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podStartSLOduration=2.461416078 podStartE2EDuration="35.906471697s" podCreationTimestamp="2026-04-21 10:11:24 +0000 UTC" firstStartedPulling="2026-04-21 10:11:25.348369344 +0000 UTC m=+453.692819208" lastFinishedPulling="2026-04-21 10:11:58.793424965 +0000 UTC m=+487.137874827" observedRunningTime="2026-04-21 10:11:59.905263271 +0000 UTC m=+488.249713158" watchObservedRunningTime="2026-04-21 10:11:59.906471697 +0000 UTC m=+488.250921582" Apr 21 10:12:09.887473 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:09.887437 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:12:09.887957 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:09.887528 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:12:09.888781 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:09.888756 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 21 10:12:09.888967 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:09.888944 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 21 
10:12:19.887632 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:19.887588 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:12:19.889939 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:19.889901 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 21 10:12:29.887635 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:29.887594 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:12:29.889910 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:29.889885 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 21 10:12:39.888067 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:39.887982 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:12:39.889248 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:39.889223 2567 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 21 10:12:44.479409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.479376 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn"] Apr 21 10:12:44.482241 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.482224 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:44.485680 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.485657 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-ce2f4-serving-cert\"" Apr 21 10:12:44.485787 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.485656 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-ce2f4-kube-rbac-proxy-sar-config\"" Apr 21 10:12:44.485787 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.485734 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 10:12:44.492133 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.492095 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn"] Apr 21 10:12:44.559354 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.559323 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:44.559490 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.559455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22fda395-1783-4084-9889-ff39f1ab9392-openshift-service-ca-bundle\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:44.660741 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.660708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22fda395-1783-4084-9889-ff39f1ab9392-openshift-service-ca-bundle\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:44.660856 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.660764 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:44.660899 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:12:44.660860 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce2f4-serving-cert: secret "switch-graph-ce2f4-serving-cert" not found Apr 21 10:12:44.660942 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:12:44.660933 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls podName:22fda395-1783-4084-9889-ff39f1ab9392 nodeName:}" failed. No retries permitted until 2026-04-21 10:12:45.160917276 +0000 UTC m=+533.505367139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls") pod "switch-graph-ce2f4-779b99dd69-zgpqn" (UID: "22fda395-1783-4084-9889-ff39f1ab9392") : secret "switch-graph-ce2f4-serving-cert" not found Apr 21 10:12:44.661392 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:44.661374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22fda395-1783-4084-9889-ff39f1ab9392-openshift-service-ca-bundle\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:45.163769 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:45.163735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:45.163947 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:12:45.163893 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-ce2f4-serving-cert: secret "switch-graph-ce2f4-serving-cert" not found Apr 21 10:12:45.163998 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:12:45.163953 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls podName:22fda395-1783-4084-9889-ff39f1ab9392 nodeName:}" failed. No retries permitted until 2026-04-21 10:12:46.163938627 +0000 UTC m=+534.508388489 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls") pod "switch-graph-ce2f4-779b99dd69-zgpqn" (UID: "22fda395-1783-4084-9889-ff39f1ab9392") : secret "switch-graph-ce2f4-serving-cert" not found Apr 21 10:12:46.171973 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:46.171920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:46.174231 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:46.174210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") pod \"switch-graph-ce2f4-779b99dd69-zgpqn\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:46.293435 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:46.293388 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:46.433420 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:46.433346 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn"] Apr 21 10:12:46.437214 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:12:46.437181 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22fda395_1783_4084_9889_ff39f1ab9392.slice/crio-e50625ad13300efc8a71d0bbf6fb203c5c7be2b9b8d12bc50b46c9214511b158 WatchSource:0}: Error finding container e50625ad13300efc8a71d0bbf6fb203c5c7be2b9b8d12bc50b46c9214511b158: Status 404 returned error can't find the container with id e50625ad13300efc8a71d0bbf6fb203c5c7be2b9b8d12bc50b46c9214511b158 Apr 21 10:12:47.023080 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:47.023042 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" event={"ID":"22fda395-1783-4084-9889-ff39f1ab9392","Type":"ContainerStarted","Data":"e50625ad13300efc8a71d0bbf6fb203c5c7be2b9b8d12bc50b46c9214511b158"} Apr 21 10:12:49.029844 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:49.029809 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" event={"ID":"22fda395-1783-4084-9889-ff39f1ab9392","Type":"ContainerStarted","Data":"17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472"} Apr 21 10:12:49.030276 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:49.029919 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:49.050081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:49.050031 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" 
podStartSLOduration=2.651958221 podStartE2EDuration="5.050019738s" podCreationTimestamp="2026-04-21 10:12:44 +0000 UTC" firstStartedPulling="2026-04-21 10:12:46.43898433 +0000 UTC m=+534.783434198" lastFinishedPulling="2026-04-21 10:12:48.837045853 +0000 UTC m=+537.181495715" observedRunningTime="2026-04-21 10:12:49.048544482 +0000 UTC m=+537.392994378" watchObservedRunningTime="2026-04-21 10:12:49.050019738 +0000 UTC m=+537.394469622" Apr 21 10:12:49.887337 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:49.887295 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:12:49.889615 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:49.889594 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 21 10:12:55.038029 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:55.038001 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:12:58.705336 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:58.705296 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn"] Apr 21 10:12:58.705699 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:58.705531 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" containerID="cri-o://17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472" 
gracePeriod=30 Apr 21 10:12:59.888124 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:59.888079 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:8080: connect: connection refused" Apr 21 10:12:59.890307 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:12:59.890284 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:13:00.036581 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:00.036544 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:03.253907 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:03.253881 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:13:05.036747 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:05.036707 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:10.036951 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:10.036915 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:10.037368 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:10.037013 2567 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:13:14.514756 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.514718 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6"] Apr 21 10:13:14.518075 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.518058 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.520472 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.520448 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 21 10:13:14.520472 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.520451 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 21 10:13:14.525590 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.525568 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6"] Apr 21 10:13:14.597934 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.597898 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ce9c-c0be-431f-969b-97130ae2cfb3-openshift-service-ca-bundle\") pod \"model-chainer-797489c4d6-rpwb6\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.598094 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.597964 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251ce9c-c0be-431f-969b-97130ae2cfb3-proxy-tls\") pod \"model-chainer-797489c4d6-rpwb6\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " 
pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.698931 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.698903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ce9c-c0be-431f-969b-97130ae2cfb3-openshift-service-ca-bundle\") pod \"model-chainer-797489c4d6-rpwb6\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.699093 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.698952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251ce9c-c0be-431f-969b-97130ae2cfb3-proxy-tls\") pod \"model-chainer-797489c4d6-rpwb6\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.699611 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.699588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ce9c-c0be-431f-969b-97130ae2cfb3-openshift-service-ca-bundle\") pod \"model-chainer-797489c4d6-rpwb6\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.701239 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.701220 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251ce9c-c0be-431f-969b-97130ae2cfb3-proxy-tls\") pod \"model-chainer-797489c4d6-rpwb6\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.828261 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.828238 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:14.949456 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:14.949424 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6"] Apr 21 10:13:14.952285 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:13:14.952256 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8251ce9c_c0be_431f_969b_97130ae2cfb3.slice/crio-3db2774ea0253555aa67652469db06b6c5ef88b6ba3bc85fb45a832e0037c412 WatchSource:0}: Error finding container 3db2774ea0253555aa67652469db06b6c5ef88b6ba3bc85fb45a832e0037c412: Status 404 returned error can't find the container with id 3db2774ea0253555aa67652469db06b6c5ef88b6ba3bc85fb45a832e0037c412 Apr 21 10:13:15.037283 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:15.037254 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:15.109689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:15.109615 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" event={"ID":"8251ce9c-c0be-431f-969b-97130ae2cfb3","Type":"ContainerStarted","Data":"4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a"} Apr 21 10:13:15.109689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:15.109652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" event={"ID":"8251ce9c-c0be-431f-969b-97130ae2cfb3","Type":"ContainerStarted","Data":"3db2774ea0253555aa67652469db06b6c5ef88b6ba3bc85fb45a832e0037c412"} Apr 21 10:13:15.109853 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:15.109743 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:15.126157 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:15.126104 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podStartSLOduration=1.12609181 podStartE2EDuration="1.12609181s" podCreationTimestamp="2026-04-21 10:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:13:15.125135372 +0000 UTC m=+563.469585258" watchObservedRunningTime="2026-04-21 10:13:15.12609181 +0000 UTC m=+563.470541695" Apr 21 10:13:20.036892 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:20.036848 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:21.120063 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:21.120031 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:24.621948 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:24.621912 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6"] Apr 21 10:13:24.622365 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:24.622188 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" containerID="cri-o://4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a" gracePeriod=30 Apr 21 10:13:24.834243 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:24.834208 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx"] Apr 21 10:13:24.834504 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:24.834464 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" containerID="cri-o://cac89c33a1ba202bc3a4e02b63bf955f3e86fe40070261592d57e280b1dfe356" gracePeriod=30 Apr 21 10:13:24.989525 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:24.989452 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"] Apr 21 10:13:24.989791 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:24.989737 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" containerID="cri-o://624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62" gracePeriod=30 Apr 21 10:13:25.037463 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:25.037428 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:26.117596 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:26.117551 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:28.628736 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:28.628710 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:13:28.720762 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:28.719994 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ecbb08d-be76-445c-9258-6b25e9b21cb4-kserve-provision-location\") pod \"7ecbb08d-be76-445c-9258-6b25e9b21cb4\" (UID: \"7ecbb08d-be76-445c-9258-6b25e9b21cb4\") " Apr 21 10:13:28.720946 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:28.720924 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ecbb08d-be76-445c-9258-6b25e9b21cb4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7ecbb08d-be76-445c-9258-6b25e9b21cb4" (UID: "7ecbb08d-be76-445c-9258-6b25e9b21cb4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:13:28.821453 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:28.821425 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7ecbb08d-be76-445c-9258-6b25e9b21cb4-kserve-provision-location\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:13:28.840966 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:28.840947 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:13:29.022808 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.022729 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22fda395-1783-4084-9889-ff39f1ab9392-openshift-service-ca-bundle\") pod \"22fda395-1783-4084-9889-ff39f1ab9392\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " Apr 21 10:13:29.022808 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.022790 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") pod \"22fda395-1783-4084-9889-ff39f1ab9392\" (UID: \"22fda395-1783-4084-9889-ff39f1ab9392\") " Apr 21 10:13:29.023099 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.023049 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22fda395-1783-4084-9889-ff39f1ab9392-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "22fda395-1783-4084-9889-ff39f1ab9392" (UID: "22fda395-1783-4084-9889-ff39f1ab9392"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:13:29.024885 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.024863 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "22fda395-1783-4084-9889-ff39f1ab9392" (UID: "22fda395-1783-4084-9889-ff39f1ab9392"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:13:29.123672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.123640 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22fda395-1783-4084-9889-ff39f1ab9392-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:13:29.123672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.123668 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/22fda395-1783-4084-9889-ff39f1ab9392-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:13:29.155916 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.155888 2567 generic.go:358] "Generic (PLEG): container finished" podID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerID="cac89c33a1ba202bc3a4e02b63bf955f3e86fe40070261592d57e280b1dfe356" exitCode=0 Apr 21 10:13:29.156031 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.155959 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" event={"ID":"073b40f3-91a8-4b2a-9419-e009f64a8680","Type":"ContainerDied","Data":"cac89c33a1ba202bc3a4e02b63bf955f3e86fe40070261592d57e280b1dfe356"} Apr 21 10:13:29.157136 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.157095 2567 generic.go:358] "Generic (PLEG): container finished" podID="22fda395-1783-4084-9889-ff39f1ab9392" containerID="17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472" exitCode=0 Apr 21 10:13:29.157224 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.157141 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" event={"ID":"22fda395-1783-4084-9889-ff39f1ab9392","Type":"ContainerDied","Data":"17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472"} Apr 21 10:13:29.157224 ip-10-0-129-84 kubenswrapper[2567]: 
I0421 10:13:29.157172 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" event={"ID":"22fda395-1783-4084-9889-ff39f1ab9392","Type":"ContainerDied","Data":"e50625ad13300efc8a71d0bbf6fb203c5c7be2b9b8d12bc50b46c9214511b158"} Apr 21 10:13:29.157224 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.157178 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn" Apr 21 10:13:29.157224 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.157194 2567 scope.go:117] "RemoveContainer" containerID="17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472" Apr 21 10:13:29.157564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.157544 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:13:29.158614 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.158592 2567 generic.go:358] "Generic (PLEG): container finished" podID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerID="624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62" exitCode=0 Apr 21 10:13:29.158757 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.158647 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" Apr 21 10:13:29.158757 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.158652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" event={"ID":"7ecbb08d-be76-445c-9258-6b25e9b21cb4","Type":"ContainerDied","Data":"624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62"} Apr 21 10:13:29.158757 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.158677 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l" event={"ID":"7ecbb08d-be76-445c-9258-6b25e9b21cb4","Type":"ContainerDied","Data":"01a65bdd57d07c2879b8e8c3e6fb64b88d4e16af88b9d403e8b207fdb8e230ed"} Apr 21 10:13:29.164749 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.164734 2567 scope.go:117] "RemoveContainer" containerID="17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472" Apr 21 10:13:29.164994 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:13:29.164977 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472\": container with ID starting with 17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472 not found: ID does not exist" containerID="17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472" Apr 21 10:13:29.165055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.165000 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472"} err="failed to get container status \"17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472\": rpc error: code = NotFound desc = could not find container \"17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472\": container with ID starting 
with 17d4dd4e71684d73f14666f04b31da391f37a76fce844fa7a47a868664158472 not found: ID does not exist" Apr 21 10:13:29.165055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.165018 2567 scope.go:117] "RemoveContainer" containerID="624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62" Apr 21 10:13:29.172245 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.172229 2567 scope.go:117] "RemoveContainer" containerID="3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74" Apr 21 10:13:29.180331 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.180311 2567 scope.go:117] "RemoveContainer" containerID="624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62" Apr 21 10:13:29.180624 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:13:29.180600 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62\": container with ID starting with 624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62 not found: ID does not exist" containerID="624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62" Apr 21 10:13:29.180682 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.180635 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62"} err="failed to get container status \"624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62\": rpc error: code = NotFound desc = could not find container \"624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62\": container with ID starting with 624b342c24c61e1810167e7155907780b6bb20d98a4ce8025556e5ef825abf62 not found: ID does not exist" Apr 21 10:13:29.180682 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.180657 2567 scope.go:117] "RemoveContainer" containerID="3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74" Apr 21 
10:13:29.180939 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:13:29.180919 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74\": container with ID starting with 3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74 not found: ID does not exist" containerID="3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74" Apr 21 10:13:29.180993 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.180943 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74"} err="failed to get container status \"3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74\": rpc error: code = NotFound desc = could not find container \"3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74\": container with ID starting with 3cc5fda9d4d3974a2efe56972c49811afdd098079165104f522bb1f20516ea74 not found: ID does not exist" Apr 21 10:13:29.192479 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.192452 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"] Apr 21 10:13:29.196762 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.196739 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-hqh7l"] Apr 21 10:13:29.205781 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.205759 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn"] Apr 21 10:13:29.207656 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.207636 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-ce2f4-779b99dd69-zgpqn"] Apr 21 10:13:29.324903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.324884 2567 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/073b40f3-91a8-4b2a-9419-e009f64a8680-kserve-provision-location\") pod \"073b40f3-91a8-4b2a-9419-e009f64a8680\" (UID: \"073b40f3-91a8-4b2a-9419-e009f64a8680\") " Apr 21 10:13:29.325195 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.325169 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073b40f3-91a8-4b2a-9419-e009f64a8680-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "073b40f3-91a8-4b2a-9419-e009f64a8680" (UID: "073b40f3-91a8-4b2a-9419-e009f64a8680"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:13:29.426156 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:29.426137 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/073b40f3-91a8-4b2a-9419-e009f64a8680-kserve-provision-location\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:13:30.164384 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.164348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" event={"ID":"073b40f3-91a8-4b2a-9419-e009f64a8680","Type":"ContainerDied","Data":"2e66737eedde381b41374b69b9db4273d7b23c7e93a6e3e506542d23e1f235bc"} Apr 21 10:13:30.164785 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.164395 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx" Apr 21 10:13:30.164785 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.164396 2567 scope.go:117] "RemoveContainer" containerID="cac89c33a1ba202bc3a4e02b63bf955f3e86fe40070261592d57e280b1dfe356" Apr 21 10:13:30.172638 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.172619 2567 scope.go:117] "RemoveContainer" containerID="48be88be561ba97386e2e5d99920e9933114e9d1f9cfd6e9425eeba1692f0b0e" Apr 21 10:13:30.184586 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.184565 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx"] Apr 21 10:13:30.186032 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.186012 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7459dc5d7-kfblx"] Apr 21 10:13:30.256399 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.256376 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" path="/var/lib/kubelet/pods/073b40f3-91a8-4b2a-9419-e009f64a8680/volumes" Apr 21 10:13:30.256748 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.256735 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22fda395-1783-4084-9889-ff39f1ab9392" path="/var/lib/kubelet/pods/22fda395-1783-4084-9889-ff39f1ab9392/volumes" Apr 21 10:13:30.257010 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:30.256999 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" path="/var/lib/kubelet/pods/7ecbb08d-be76-445c-9258-6b25e9b21cb4/volumes" Apr 21 10:13:31.117372 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:31.117333 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:36.116888 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:36.116848 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:36.117350 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:36.116961 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:41.117919 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:41.117874 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:46.116799 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:46.116759 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:51.116964 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:51.116929 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:13:52.136092 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:52.136066 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:13:52.138024 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:52.138003 2567 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:13:54.635495 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:13:54.635470 2567 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8251ce9c_c0be_431f_969b_97130ae2cfb3.slice/crio-4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a.scope/pids.max": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8251ce9c_c0be_431f_969b_97130ae2cfb3.slice/crio-4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a.scope/pids.max: no such device Apr 21 10:13:54.662608 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:13:54.662583 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8251ce9c_c0be_431f_969b_97130ae2cfb3.slice/crio-conmon-4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a.scope\": RecentStats: unable to find data in memory cache]" Apr 21 10:13:55.237153 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.237102 2567 generic.go:358] "Generic (PLEG): container finished" podID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerID="4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a" exitCode=0 Apr 21 10:13:55.237291 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.237170 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" event={"ID":"8251ce9c-c0be-431f-969b-97130ae2cfb3","Type":"ContainerDied","Data":"4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a"} Apr 21 10:13:55.261047 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.261029 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:55.320866 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.320840 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251ce9c-c0be-431f-969b-97130ae2cfb3-proxy-tls\") pod \"8251ce9c-c0be-431f-969b-97130ae2cfb3\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " Apr 21 10:13:55.320984 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.320876 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ce9c-c0be-431f-969b-97130ae2cfb3-openshift-service-ca-bundle\") pod \"8251ce9c-c0be-431f-969b-97130ae2cfb3\" (UID: \"8251ce9c-c0be-431f-969b-97130ae2cfb3\") " Apr 21 10:13:55.321283 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.321263 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8251ce9c-c0be-431f-969b-97130ae2cfb3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8251ce9c-c0be-431f-969b-97130ae2cfb3" (UID: "8251ce9c-c0be-431f-969b-97130ae2cfb3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:13:55.322759 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.322738 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8251ce9c-c0be-431f-969b-97130ae2cfb3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8251ce9c-c0be-431f-969b-97130ae2cfb3" (UID: "8251ce9c-c0be-431f-969b-97130ae2cfb3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:13:55.421848 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.421775 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8251ce9c-c0be-431f-969b-97130ae2cfb3-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:13:55.421848 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:55.421800 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8251ce9c-c0be-431f-969b-97130ae2cfb3-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:13:56.243539 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:56.243499 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" event={"ID":"8251ce9c-c0be-431f-969b-97130ae2cfb3","Type":"ContainerDied","Data":"3db2774ea0253555aa67652469db06b6c5ef88b6ba3bc85fb45a832e0037c412"} Apr 21 10:13:56.243539 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:56.243523 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6" Apr 21 10:13:56.243985 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:56.243546 2567 scope.go:117] "RemoveContainer" containerID="4f133ba42570a90be6b5a67c9ce6df55611eb78f0bd6fb1fb0dd73f1b7c53d9a" Apr 21 10:13:56.263445 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:56.263420 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6"] Apr 21 10:13:56.267059 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:56.267034 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-797489c4d6-rpwb6"] Apr 21 10:13:58.256505 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:13:58.256473 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" path="/var/lib/kubelet/pods/8251ce9c-c0be-431f-969b-97130ae2cfb3/volumes" Apr 21 10:14:08.958880 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.958803 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7"] Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959096 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="storage-initializer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959106 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="storage-initializer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959129 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959135 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" 
containerName="model-chainer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959143 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959151 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959170 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959176 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959184 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="storage-initializer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959188 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="storage-initializer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959197 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959201 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959244 2567 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="8251ce9c-c0be-431f-969b-97130ae2cfb3" containerName="model-chainer" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959251 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ecbb08d-be76-445c-9258-6b25e9b21cb4" containerName="kserve-container" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959256 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="073b40f3-91a8-4b2a-9419-e009f64a8680" containerName="kserve-container" Apr 21 10:14:08.959330 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.959264 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="22fda395-1783-4084-9889-ff39f1ab9392" containerName="switch-graph-ce2f4" Apr 21 10:14:08.963397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.963381 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:08.965918 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.965889 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b4346-serving-cert\"" Apr 21 10:14:08.966270 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.966252 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 10:14:08.966359 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.966276 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5m4mw\"" Apr 21 10:14:08.966359 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.966281 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b4346-kube-rbac-proxy-sar-config\"" Apr 21 10:14:08.968895 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:08.968871 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7"] Apr 21 10:14:09.034008 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.033977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-openshift-service-ca-bundle\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.034182 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.034014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.134960 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.134930 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-openshift-service-ca-bundle\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.134960 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.134963 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.135174 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:14:09.135082 2567 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/switch-graph-b4346-serving-cert: secret "switch-graph-b4346-serving-cert" not found Apr 21 10:14:09.135174 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:14:09.135169 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls podName:c6fd55cb-e9cd-4e01-952c-7d571a1efca7 nodeName:}" failed. No retries permitted until 2026-04-21 10:14:09.635145415 +0000 UTC m=+617.979595279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls") pod "switch-graph-b4346-6b777556b4-xkcl7" (UID: "c6fd55cb-e9cd-4e01-952c-7d571a1efca7") : secret "switch-graph-b4346-serving-cert" not found Apr 21 10:14:09.135540 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.135523 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-openshift-service-ca-bundle\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.639693 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.639654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.641953 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.641923 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls\") pod \"switch-graph-b4346-6b777556b4-xkcl7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " 
pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.873645 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.873620 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:09.992059 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:09.992027 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7"] Apr 21 10:14:09.995292 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:14:09.995265 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6fd55cb_e9cd_4e01_952c_7d571a1efca7.slice/crio-eee19b3b831a700c97fffc484cab9a4f1643fe501c5390aab603c524fc39afe6 WatchSource:0}: Error finding container eee19b3b831a700c97fffc484cab9a4f1643fe501c5390aab603c524fc39afe6: Status 404 returned error can't find the container with id eee19b3b831a700c97fffc484cab9a4f1643fe501c5390aab603c524fc39afe6 Apr 21 10:14:10.280710 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:10.280682 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" event={"ID":"c6fd55cb-e9cd-4e01-952c-7d571a1efca7","Type":"ContainerStarted","Data":"ce11aeaff9531833cba1512b3d92fde97910881dd01a4652d89a6a005ba4072a"} Apr 21 10:14:10.280854 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:10.280715 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" event={"ID":"c6fd55cb-e9cd-4e01-952c-7d571a1efca7","Type":"ContainerStarted","Data":"eee19b3b831a700c97fffc484cab9a4f1643fe501c5390aab603c524fc39afe6"} Apr 21 10:14:10.280854 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:10.280824 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:10.297410 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:14:10.297367 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podStartSLOduration=2.297353313 podStartE2EDuration="2.297353313s" podCreationTimestamp="2026-04-21 10:14:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:14:10.295660337 +0000 UTC m=+618.640110221" watchObservedRunningTime="2026-04-21 10:14:10.297353313 +0000 UTC m=+618.641803197" Apr 21 10:14:16.288619 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:16.288588 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:14:34.804645 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.804603 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"] Apr 21 10:14:34.807945 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.807927 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:34.810355 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.810336 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cd9c7-serving-cert\"" Apr 21 10:14:34.810490 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.810473 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-cd9c7-kube-rbac-proxy-sar-config\"" Apr 21 10:14:34.816870 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.816841 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"] Apr 21 10:14:34.837230 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.837206 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-openshift-service-ca-bundle\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:34.837323 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.837307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:34.938233 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.938211 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: 
\"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:34.938352 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.938243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-openshift-service-ca-bundle\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:34.938397 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:14:34.938360 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-cd9c7-serving-cert: secret "sequence-graph-cd9c7-serving-cert" not found Apr 21 10:14:34.938440 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:14:34.938430 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls podName:08cc312b-fc20-4c9b-8df3-b42f2a0b71f8 nodeName:}" failed. No retries permitted until 2026-04-21 10:14:35.438413738 +0000 UTC m=+643.782863606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls") pod "sequence-graph-cd9c7-b856cb664-jsx2p" (UID: "08cc312b-fc20-4c9b-8df3-b42f2a0b71f8") : secret "sequence-graph-cd9c7-serving-cert" not found Apr 21 10:14:34.938805 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:34.938789 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-openshift-service-ca-bundle\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:35.441859 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:35.441828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:35.444132 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:35.444095 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls\") pod \"sequence-graph-cd9c7-b856cb664-jsx2p\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:35.719150 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:35.719052 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:35.834331 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:35.834241 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"] Apr 21 10:14:35.836910 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:14:35.836881 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08cc312b_fc20_4c9b_8df3_b42f2a0b71f8.slice/crio-b2f56fb30e36ef1bf0dcdf5ddd186b38ba5692de0f52ecff04680acc2ca8f4e7 WatchSource:0}: Error finding container b2f56fb30e36ef1bf0dcdf5ddd186b38ba5692de0f52ecff04680acc2ca8f4e7: Status 404 returned error can't find the container with id b2f56fb30e36ef1bf0dcdf5ddd186b38ba5692de0f52ecff04680acc2ca8f4e7 Apr 21 10:14:36.357411 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:36.357378 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" event={"ID":"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8","Type":"ContainerStarted","Data":"c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588"} Apr 21 10:14:36.357411 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:36.357413 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" event={"ID":"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8","Type":"ContainerStarted","Data":"b2f56fb30e36ef1bf0dcdf5ddd186b38ba5692de0f52ecff04680acc2ca8f4e7"} Apr 21 10:14:36.357608 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:36.357506 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:14:36.372146 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:36.372083 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" 
podStartSLOduration=2.372068795 podStartE2EDuration="2.372068795s" podCreationTimestamp="2026-04-21 10:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:14:36.370348778 +0000 UTC m=+644.714798663" watchObservedRunningTime="2026-04-21 10:14:36.372068795 +0000 UTC m=+644.716518679" Apr 21 10:14:42.366541 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:14:42.366511 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:18:52.158351 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:18:52.158270 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:18:52.158904 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:18:52.158767 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:22:23.681086 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:23.681056 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7"] Apr 21 10:22:23.683604 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:23.681390 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" containerID="cri-o://ce11aeaff9531833cba1512b3d92fde97910881dd01a4652d89a6a005ba4072a" gracePeriod=30 Apr 21 10:22:26.287659 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:26.287613 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:31.287425 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:31.287388 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:36.287049 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:36.287007 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:36.287516 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:36.287131 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:22:41.286939 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:41.286898 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:46.287636 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:46.287602 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:49.554751 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:49.554719 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"] Apr 21 10:22:49.555305 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:49.554953 2567 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7" containerID="cri-o://c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588" gracePeriod=30 Apr 21 10:22:51.287735 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:51.287694 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:52.364704 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:52.364664 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:22:53.780929 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.780897 2567 generic.go:358] "Generic (PLEG): container finished" podID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerID="ce11aeaff9531833cba1512b3d92fde97910881dd01a4652d89a6a005ba4072a" exitCode=0 Apr 21 10:22:53.781284 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.780952 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" event={"ID":"c6fd55cb-e9cd-4e01-952c-7d571a1efca7","Type":"ContainerDied","Data":"ce11aeaff9531833cba1512b3d92fde97910881dd01a4652d89a6a005ba4072a"} Apr 21 10:22:53.817552 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.817532 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:22:53.912145 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.912104 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls\") pod \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " Apr 21 10:22:53.912296 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.912187 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-openshift-service-ca-bundle\") pod \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\" (UID: \"c6fd55cb-e9cd-4e01-952c-7d571a1efca7\") " Apr 21 10:22:53.912483 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.912461 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c6fd55cb-e9cd-4e01-952c-7d571a1efca7" (UID: "c6fd55cb-e9cd-4e01-952c-7d571a1efca7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:22:53.913969 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:53.913952 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c6fd55cb-e9cd-4e01-952c-7d571a1efca7" (UID: "c6fd55cb-e9cd-4e01-952c-7d571a1efca7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:22:54.013040 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.012979 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:22:54.013040 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.013001 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6fd55cb-e9cd-4e01-952c-7d571a1efca7-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:22:54.784731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.784701 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" Apr 21 10:22:54.784731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.784716 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7" event={"ID":"c6fd55cb-e9cd-4e01-952c-7d571a1efca7","Type":"ContainerDied","Data":"eee19b3b831a700c97fffc484cab9a4f1643fe501c5390aab603c524fc39afe6"} Apr 21 10:22:54.785270 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.784759 2567 scope.go:117] "RemoveContainer" containerID="ce11aeaff9531833cba1512b3d92fde97910881dd01a4652d89a6a005ba4072a" Apr 21 10:22:54.800421 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.800388 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7"] Apr 21 10:22:54.804142 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:54.804100 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b4346-6b777556b4-xkcl7"] Apr 21 10:22:56.256205 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:56.256173 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" path="/var/lib/kubelet/pods/c6fd55cb-e9cd-4e01-952c-7d571a1efca7/volumes" Apr 21 10:22:57.364370 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:22:57.364331 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:23:02.364488 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:02.364447 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:23:02.364874 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:02.364569 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:23:07.364436 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:07.364399 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:23:12.364771 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:12.364685 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:23:17.364415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:17.364377 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" 
containerName="sequence-graph-cd9c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:23:19.682397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.682376 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" Apr 21 10:23:19.806488 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.806458 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls\") pod \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " Apr 21 10:23:19.806678 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.806533 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-openshift-service-ca-bundle\") pod \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\" (UID: \"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8\") " Apr 21 10:23:19.806881 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.806851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" (UID: "08cc312b-fc20-4c9b-8df3-b42f2a0b71f8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:23:19.808552 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.808500 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" (UID: "08cc312b-fc20-4c9b-8df3-b42f2a0b71f8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:23:19.858342 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.858308 2567 generic.go:358] "Generic (PLEG): container finished" podID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerID="c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588" exitCode=0
Apr 21 10:23:19.858470 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.858348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" event={"ID":"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8","Type":"ContainerDied","Data":"c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588"}
Apr 21 10:23:19.858470 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.858364 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"
Apr 21 10:23:19.858470 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.858382 2567 scope.go:117] "RemoveContainer" containerID="c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588"
Apr 21 10:23:19.858470 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.858370 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p" event={"ID":"08cc312b-fc20-4c9b-8df3-b42f2a0b71f8","Type":"ContainerDied","Data":"b2f56fb30e36ef1bf0dcdf5ddd186b38ba5692de0f52ecff04680acc2ca8f4e7"}
Apr 21 10:23:19.866543 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.866525 2567 scope.go:117] "RemoveContainer" containerID="c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588"
Apr 21 10:23:19.866754 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:23:19.866739 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588\": container with ID starting with c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588 not found: ID does not exist" containerID="c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588"
Apr 21 10:23:19.866806 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.866760 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588"} err="failed to get container status \"c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588\": rpc error: code = NotFound desc = could not find container \"c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588\": container with ID starting with c0142ac33d8df5e8a8e8c889ce69d9d47ae137d5ebe496751c4781f83d7bf588 not found: ID does not exist"
Apr 21 10:23:19.880060 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.880034 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"]
Apr 21 10:23:19.883530 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.883510 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-cd9c7-b856cb664-jsx2p"]
Apr 21 10:23:19.907462 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.907439 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:23:19.907462 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:19.907459 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:23:20.256820 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:20.256746 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" path="/var/lib/kubelet/pods/08cc312b-fc20-4c9b-8df3-b42f2a0b71f8/volumes"
Apr 21 10:23:33.949337 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949238 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"]
Apr 21 10:23:33.949858 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949658 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346"
Apr 21 10:23:33.949858 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949677 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346"
Apr 21 10:23:33.949858 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949711 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7"
Apr 21 10:23:33.949858 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949720 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7"
Apr 21 10:23:33.949858 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949786 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="08cc312b-fc20-4c9b-8df3-b42f2a0b71f8" containerName="sequence-graph-cd9c7"
Apr 21 10:23:33.949858 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.949795 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6fd55cb-e9cd-4e01-952c-7d571a1efca7" containerName="switch-graph-b4346"
Apr 21 10:23:33.953955 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.953932 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"]
Apr 21 10:23:33.954090 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.954058 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:33.956416 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.956391 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-42a8c-kube-rbac-proxy-sar-config\""
Apr 21 10:23:33.956544 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.956462 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5m4mw\""
Apr 21 10:23:33.956613 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.956595 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-42a8c-serving-cert\""
Apr 21 10:23:33.956698 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:33.956676 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 21 10:23:34.017385 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.017361 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b2888ca-ff50-4e82-8e5d-524fceb40042-proxy-tls\") pod \"ensemble-graph-42a8c-86c8649578-qg926\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") " pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.017504 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.017401 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2888ca-ff50-4e82-8e5d-524fceb40042-openshift-service-ca-bundle\") pod \"ensemble-graph-42a8c-86c8649578-qg926\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") " pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.118402 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.118376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b2888ca-ff50-4e82-8e5d-524fceb40042-proxy-tls\") pod \"ensemble-graph-42a8c-86c8649578-qg926\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") " pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.118548 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.118409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2888ca-ff50-4e82-8e5d-524fceb40042-openshift-service-ca-bundle\") pod \"ensemble-graph-42a8c-86c8649578-qg926\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") " pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.118977 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.118958 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2888ca-ff50-4e82-8e5d-524fceb40042-openshift-service-ca-bundle\") pod \"ensemble-graph-42a8c-86c8649578-qg926\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") " pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.120567 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.120545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b2888ca-ff50-4e82-8e5d-524fceb40042-proxy-tls\") pod \"ensemble-graph-42a8c-86c8649578-qg926\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") " pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.265000 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.264949 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.380337 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.380312 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"]
Apr 21 10:23:34.383001 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:23:34.382976 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58 WatchSource:0}: Error finding container 1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58: Status 404 returned error can't find the container with id 1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58
Apr 21 10:23:34.385182 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.385167 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:23:34.901199 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.901160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" event={"ID":"5b2888ca-ff50-4e82-8e5d-524fceb40042","Type":"ContainerStarted","Data":"52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3"}
Apr 21 10:23:34.901199 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.901197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" event={"ID":"5b2888ca-ff50-4e82-8e5d-524fceb40042","Type":"ContainerStarted","Data":"1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58"}
Apr 21 10:23:34.901430 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.901252 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:34.916791 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:34.916755 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podStartSLOduration=1.9167416130000001 podStartE2EDuration="1.916741613s" podCreationTimestamp="2026-04-21 10:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:23:34.915413498 +0000 UTC m=+1183.259863384" watchObservedRunningTime="2026-04-21 10:23:34.916741613 +0000 UTC m=+1183.261191498"
Apr 21 10:23:40.909536 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:40.909509 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:44.004029 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:44.003990 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"]
Apr 21 10:23:44.004474 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:44.004254 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" containerID="cri-o://52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3" gracePeriod=30
Apr 21 10:23:45.907227 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:45.907188 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:23:50.908427 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:50.908384 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:23:52.179689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:52.179658 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log"
Apr 21 10:23:52.180404 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:52.180382 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log"
Apr 21 10:23:55.907359 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:55.907322 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:23:55.907803 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:55.907470 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:23:59.699801 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.699765 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"]
Apr 21 10:23:59.704393 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.704376 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:23:59.706896 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.706876 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3f0d8-serving-cert\""
Apr 21 10:23:59.707013 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.706876 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3f0d8-kube-rbac-proxy-sar-config\""
Apr 21 10:23:59.711892 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.711876 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"]
Apr 21 10:23:59.821275 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.821249 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2645bec2-eaa6-4767-8303-de65b9ff786e-proxy-tls\") pod \"sequence-graph-3f0d8-647745dd4-j58jx\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") " pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:23:59.821409 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.821303 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2645bec2-eaa6-4767-8303-de65b9ff786e-openshift-service-ca-bundle\") pod \"sequence-graph-3f0d8-647745dd4-j58jx\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") " pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:23:59.922716 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.922684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2645bec2-eaa6-4767-8303-de65b9ff786e-openshift-service-ca-bundle\") pod \"sequence-graph-3f0d8-647745dd4-j58jx\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") " pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:23:59.922841 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.922783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2645bec2-eaa6-4767-8303-de65b9ff786e-proxy-tls\") pod \"sequence-graph-3f0d8-647745dd4-j58jx\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") " pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:23:59.923324 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.923301 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2645bec2-eaa6-4767-8303-de65b9ff786e-openshift-service-ca-bundle\") pod \"sequence-graph-3f0d8-647745dd4-j58jx\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") " pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:23:59.925292 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:23:59.925272 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2645bec2-eaa6-4767-8303-de65b9ff786e-proxy-tls\") pod \"sequence-graph-3f0d8-647745dd4-j58jx\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") " pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:00.015292 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.015215 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:00.131978 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.131818 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"]
Apr 21 10:24:00.907777 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.907729 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:00.979230 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.979198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" event={"ID":"2645bec2-eaa6-4767-8303-de65b9ff786e","Type":"ContainerStarted","Data":"82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51"}
Apr 21 10:24:00.979375 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.979235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" event={"ID":"2645bec2-eaa6-4767-8303-de65b9ff786e","Type":"ContainerStarted","Data":"3772cfabd71334e3d4010bf0d92308aedfd8a363f9f77c72e54101aca6eb737c"}
Apr 21 10:24:00.979375 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.979271 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:00.996495 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:00.996454 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podStartSLOduration=1.9964428440000002 podStartE2EDuration="1.996442844s" podCreationTimestamp="2026-04-21 10:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:24:00.996192354 +0000 UTC m=+1209.340642238" watchObservedRunningTime="2026-04-21 10:24:00.996442844 +0000 UTC m=+1209.340892757"
Apr 21 10:24:05.907757 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:05.907720 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:06.988992 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:06.988963 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:09.771811 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:09.771774 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"]
Apr 21 10:24:09.772198 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:09.771983 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" containerID="cri-o://82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51" gracePeriod=30
Apr 21 10:24:10.908533 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:10.908492 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:11.987032 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:11.986995 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:14.059165 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:14.059132 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58\": RecentStats: unable to find data in memory cache]"
Apr 21 10:24:14.059552 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:14.059335 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58\": RecentStats: unable to find data in memory cache]"
Apr 21 10:24:14.059616 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:14.059443 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58\": RecentStats: unable to find data in memory cache]"
Apr 21 10:24:14.059941 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:14.059899 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58\": RecentStats: unable to find data in memory cache]"
Apr 21 10:24:14.191056 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.191034 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:24:14.334208 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.334172 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b2888ca-ff50-4e82-8e5d-524fceb40042-proxy-tls\") pod \"5b2888ca-ff50-4e82-8e5d-524fceb40042\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") "
Apr 21 10:24:14.334208 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.334214 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2888ca-ff50-4e82-8e5d-524fceb40042-openshift-service-ca-bundle\") pod \"5b2888ca-ff50-4e82-8e5d-524fceb40042\" (UID: \"5b2888ca-ff50-4e82-8e5d-524fceb40042\") "
Apr 21 10:24:14.334589 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.334562 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2888ca-ff50-4e82-8e5d-524fceb40042-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5b2888ca-ff50-4e82-8e5d-524fceb40042" (UID: "5b2888ca-ff50-4e82-8e5d-524fceb40042"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:24:14.336165 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.336127 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b2888ca-ff50-4e82-8e5d-524fceb40042-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b2888ca-ff50-4e82-8e5d-524fceb40042" (UID: "5b2888ca-ff50-4e82-8e5d-524fceb40042"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:24:14.435300 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.435273 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b2888ca-ff50-4e82-8e5d-524fceb40042-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:24:14.435300 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:14.435296 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b2888ca-ff50-4e82-8e5d-524fceb40042-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:24:15.020127 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.020085 2567 generic.go:358] "Generic (PLEG): container finished" podID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerID="52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3" exitCode=137
Apr 21 10:24:15.020289 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.020172 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" event={"ID":"5b2888ca-ff50-4e82-8e5d-524fceb40042","Type":"ContainerDied","Data":"52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3"}
Apr 21 10:24:15.020289 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.020213 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926" event={"ID":"5b2888ca-ff50-4e82-8e5d-524fceb40042","Type":"ContainerDied","Data":"1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58"}
Apr 21 10:24:15.020289 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.020229 2567 scope.go:117] "RemoveContainer" containerID="52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3"
Apr 21 10:24:15.020289 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.020186 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"
Apr 21 10:24:15.028821 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.028804 2567 scope.go:117] "RemoveContainer" containerID="52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3"
Apr 21 10:24:15.029055 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:15.029038 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3\": container with ID starting with 52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3 not found: ID does not exist" containerID="52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3"
Apr 21 10:24:15.029126 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.029063 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3"} err="failed to get container status \"52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3\": rpc error: code = NotFound desc = could not find container \"52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3\": container with ID starting with 52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3 not found: ID does not exist"
Apr 21 10:24:15.040621 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.040597 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"]
Apr 21 10:24:15.046094 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:15.046070 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-42a8c-86c8649578-qg926"]
Apr 21 10:24:16.257756 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:16.257718 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" path="/var/lib/kubelet/pods/5b2888ca-ff50-4e82-8e5d-524fceb40042/volumes"
Apr 21 10:24:16.987078 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:16.987040 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:21.986218 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:21.986181 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:21.986591 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:21.986286 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:26.986885 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:26.986846 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:31.986868 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:31.986824 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:34.429633 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:34.429605 2567 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/22a3ce8104e3be9de1dffebb7cc741b81a7c20e100bcc04c97e5f8a62888cbb1/diff" to get inode usage: stat /var/lib/containers/storage/overlay/22a3ce8104e3be9de1dffebb7cc741b81a7c20e100bcc04c97e5f8a62888cbb1/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/kserve-ci-e2e-test_ensemble-graph-42a8c-86c8649578-qg926_5b2888ca-ff50-4e82-8e5d-524fceb40042/ensemble-graph-42a8c/0.log" to get inode usage: stat /var/log/pods/kserve-ci-e2e-test_ensemble-graph-42a8c-86c8649578-qg926_5b2888ca-ff50-4e82-8e5d-524fceb40042/ensemble-graph-42a8c/0.log: no such file or directory
Apr 21 10:24:36.986466 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:36.986423 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:24:39.800345 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:39.800200 2567 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2645bec2_eaa6_4767_8303_de65b9ff786e.slice/crio-conmon-82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-conmon-52b60556c6f7984a5c7d966e9f6ebe0805b28e42e9807727943b06ac101d3bc3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice/crio-1c00e85f277e6daea00be19582114b622f05c2945a67ecd73848e5ec97db4e58\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2888ca_ff50_4e82_8e5d_524fceb40042.slice\": RecentStats: unable to find data in memory cache]"
Apr 21 10:24:39.919689 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:39.919663 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:39.929786 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:39.929765 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2645bec2-eaa6-4767-8303-de65b9ff786e-openshift-service-ca-bundle\") pod \"2645bec2-eaa6-4767-8303-de65b9ff786e\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") "
Apr 21 10:24:39.929878 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:39.929847 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2645bec2-eaa6-4767-8303-de65b9ff786e-proxy-tls\") pod \"2645bec2-eaa6-4767-8303-de65b9ff786e\" (UID: \"2645bec2-eaa6-4767-8303-de65b9ff786e\") "
Apr 21 10:24:39.930103 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:39.930080 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2645bec2-eaa6-4767-8303-de65b9ff786e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2645bec2-eaa6-4767-8303-de65b9ff786e" (UID: "2645bec2-eaa6-4767-8303-de65b9ff786e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:24:39.931668 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:39.931650 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2645bec2-eaa6-4767-8303-de65b9ff786e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2645bec2-eaa6-4767-8303-de65b9ff786e" (UID: "2645bec2-eaa6-4767-8303-de65b9ff786e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:24:40.030540 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.030513 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2645bec2-eaa6-4767-8303-de65b9ff786e-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:24:40.030672 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.030543 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2645bec2-eaa6-4767-8303-de65b9ff786e-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\""
Apr 21 10:24:40.090020 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.089986 2567 generic.go:358] "Generic (PLEG): container finished" podID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerID="82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51" exitCode=0
Apr 21 10:24:40.090166 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.090043 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"
Apr 21 10:24:40.090166 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.090044 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" event={"ID":"2645bec2-eaa6-4767-8303-de65b9ff786e","Type":"ContainerDied","Data":"82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51"}
Apr 21 10:24:40.090166 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.090087 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx" event={"ID":"2645bec2-eaa6-4767-8303-de65b9ff786e","Type":"ContainerDied","Data":"3772cfabd71334e3d4010bf0d92308aedfd8a363f9f77c72e54101aca6eb737c"}
Apr 21 10:24:40.090166 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.090104 2567 scope.go:117] "RemoveContainer" containerID="82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51"
Apr 21 10:24:40.100379 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.100341 2567 scope.go:117] "RemoveContainer" containerID="82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51"
Apr 21 10:24:40.100673 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:24:40.100653 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51\": container with ID starting with 82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51 not found: ID does not exist" containerID="82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51"
Apr 21 10:24:40.100753 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.100681 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51"} err="failed to get container status 
\"82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51\": rpc error: code = NotFound desc = could not find container \"82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51\": container with ID starting with 82773c322890c07655f028382baaae5b6837db69c45d5084d6907f826c840f51 not found: ID does not exist" Apr 21 10:24:40.112695 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.112670 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"] Apr 21 10:24:40.115507 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.115481 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3f0d8-647745dd4-j58jx"] Apr 21 10:24:40.257043 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:40.257017 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" path="/var/lib/kubelet/pods/2645bec2-eaa6-4767-8303-de65b9ff786e/volumes" Apr 21 10:24:54.217266 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217231 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb"] Apr 21 10:24:54.217644 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217572 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" Apr 21 10:24:54.217644 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217586 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" Apr 21 10:24:54.217644 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217599 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" Apr 21 10:24:54.217644 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217605 2567 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" Apr 21 10:24:54.217774 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217652 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2645bec2-eaa6-4767-8303-de65b9ff786e" containerName="sequence-graph-3f0d8" Apr 21 10:24:54.217774 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.217664 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b2888ca-ff50-4e82-8e5d-524fceb40042" containerName="ensemble-graph-42a8c" Apr 21 10:24:54.221802 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.221783 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.224054 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.224034 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-c50a0-kube-rbac-proxy-sar-config\"" Apr 21 10:24:54.224179 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.224038 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-c50a0-serving-cert\"" Apr 21 10:24:54.224364 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.224349 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 10:24:54.224959 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.224946 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5m4mw\"" Apr 21 10:24:54.230004 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.229984 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb"] Apr 21 10:24:54.343277 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.343240 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7efb65b-b284-4887-b32b-ab6a2499a07f-openshift-service-ca-bundle\") pod \"ensemble-graph-c50a0-7b8946587d-pz8fb\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.343423 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.343306 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7efb65b-b284-4887-b32b-ab6a2499a07f-proxy-tls\") pod \"ensemble-graph-c50a0-7b8946587d-pz8fb\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.444265 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.444235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7efb65b-b284-4887-b32b-ab6a2499a07f-openshift-service-ca-bundle\") pod \"ensemble-graph-c50a0-7b8946587d-pz8fb\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.444401 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.444287 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7efb65b-b284-4887-b32b-ab6a2499a07f-proxy-tls\") pod \"ensemble-graph-c50a0-7b8946587d-pz8fb\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.444995 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.444972 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7efb65b-b284-4887-b32b-ab6a2499a07f-openshift-service-ca-bundle\") pod 
\"ensemble-graph-c50a0-7b8946587d-pz8fb\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.446613 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.446588 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7efb65b-b284-4887-b32b-ab6a2499a07f-proxy-tls\") pod \"ensemble-graph-c50a0-7b8946587d-pz8fb\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.532430 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.532403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:54.646579 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:54.646554 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb"] Apr 21 10:24:54.649067 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:24:54.649030 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7efb65b_b284_4887_b32b_ab6a2499a07f.slice/crio-074b0886eaa077a48db11bb278ba4c6033aa61dd5df79548629ca244ea5bc7ef WatchSource:0}: Error finding container 074b0886eaa077a48db11bb278ba4c6033aa61dd5df79548629ca244ea5bc7ef: Status 404 returned error can't find the container with id 074b0886eaa077a48db11bb278ba4c6033aa61dd5df79548629ca244ea5bc7ef Apr 21 10:24:55.134434 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:55.134388 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" event={"ID":"e7efb65b-b284-4887-b32b-ab6a2499a07f","Type":"ContainerStarted","Data":"1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece"} Apr 21 10:24:55.134434 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:55.134428 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" event={"ID":"e7efb65b-b284-4887-b32b-ab6a2499a07f","Type":"ContainerStarted","Data":"074b0886eaa077a48db11bb278ba4c6033aa61dd5df79548629ca244ea5bc7ef"} Apr 21 10:24:55.134643 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:55.134539 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:24:55.150506 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:24:55.150468 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podStartSLOduration=1.150456951 podStartE2EDuration="1.150456951s" podCreationTimestamp="2026-04-21 10:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:24:55.149168545 +0000 UTC m=+1263.493618433" watchObservedRunningTime="2026-04-21 10:24:55.150456951 +0000 UTC m=+1263.494906837" Apr 21 10:25:01.143082 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:01.143052 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:25:19.991944 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:19.991912 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v"] Apr 21 10:25:19.995430 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:19.995407 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:19.997501 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:19.997478 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-258c6-serving-cert\"" Apr 21 10:25:19.997612 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:19.997527 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-258c6-kube-rbac-proxy-sar-config\"" Apr 21 10:25:20.003743 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.003720 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v"] Apr 21 10:25:20.143887 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.143849 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc39bf1d-9a28-407e-b978-72c038a130cb-proxy-tls\") pod \"sequence-graph-258c6-b67dc4465-ct42v\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.144034 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.143940 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc39bf1d-9a28-407e-b978-72c038a130cb-openshift-service-ca-bundle\") pod \"sequence-graph-258c6-b67dc4465-ct42v\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.245240 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.245170 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc39bf1d-9a28-407e-b978-72c038a130cb-proxy-tls\") pod \"sequence-graph-258c6-b67dc4465-ct42v\" (UID: 
\"bc39bf1d-9a28-407e-b978-72c038a130cb\") " pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.245240 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.245218 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc39bf1d-9a28-407e-b978-72c038a130cb-openshift-service-ca-bundle\") pod \"sequence-graph-258c6-b67dc4465-ct42v\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.245769 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.245749 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc39bf1d-9a28-407e-b978-72c038a130cb-openshift-service-ca-bundle\") pod \"sequence-graph-258c6-b67dc4465-ct42v\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.247577 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.247553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc39bf1d-9a28-407e-b978-72c038a130cb-proxy-tls\") pod \"sequence-graph-258c6-b67dc4465-ct42v\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.305983 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.305951 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:20.426236 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:20.426195 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v"] Apr 21 10:25:20.428840 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:25:20.428811 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc39bf1d_9a28_407e_b978_72c038a130cb.slice/crio-21845a69c668acee01ee93f6afa2d1fa3c41c3511befbc46459fc7780ace8fb0 WatchSource:0}: Error finding container 21845a69c668acee01ee93f6afa2d1fa3c41c3511befbc46459fc7780ace8fb0: Status 404 returned error can't find the container with id 21845a69c668acee01ee93f6afa2d1fa3c41c3511befbc46459fc7780ace8fb0 Apr 21 10:25:21.211211 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:21.211174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" event={"ID":"bc39bf1d-9a28-407e-b978-72c038a130cb","Type":"ContainerStarted","Data":"395ee79992045e431e6c298a7c6b977bddcbe17e84ef8af1bfa32c7692893e84"} Apr 21 10:25:21.211211 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:21.211211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" event={"ID":"bc39bf1d-9a28-407e-b978-72c038a130cb","Type":"ContainerStarted","Data":"21845a69c668acee01ee93f6afa2d1fa3c41c3511befbc46459fc7780ace8fb0"} Apr 21 10:25:21.211638 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:21.211236 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:25:21.226563 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:21.226518 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" 
podStartSLOduration=2.226503535 podStartE2EDuration="2.226503535s" podCreationTimestamp="2026-04-21 10:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:25:21.22496232 +0000 UTC m=+1289.569412217" watchObservedRunningTime="2026-04-21 10:25:21.226503535 +0000 UTC m=+1289.570953420" Apr 21 10:25:27.219888 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:25:27.219861 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:28:52.199832 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:28:52.199801 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:28:52.202161 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:28:52.202132 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:33:08.912253 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:08.912221 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb"] Apr 21 10:33:08.914753 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:08.912470 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" containerID="cri-o://1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece" gracePeriod=30 Apr 21 10:33:11.141528 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:11.141491 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:16.141131 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:16.141081 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:21.141998 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:21.141960 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:21.142442 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:21.142080 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:33:26.141617 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:26.141581 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:31.141626 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:31.141587 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:34.673617 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:34.673586 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v"] Apr 21 10:33:34.674090 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:34.673807 2567 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" containerID="cri-o://395ee79992045e431e6c298a7c6b977bddcbe17e84ef8af1bfa32c7692893e84" gracePeriod=30 Apr 21 10:33:36.141489 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:36.141451 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:37.218037 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:37.218001 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:39.046119 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.046094 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:33:39.090366 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.090340 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7efb65b-b284-4887-b32b-ab6a2499a07f-proxy-tls\") pod \"e7efb65b-b284-4887-b32b-ab6a2499a07f\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " Apr 21 10:33:39.090482 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.090385 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7efb65b-b284-4887-b32b-ab6a2499a07f-openshift-service-ca-bundle\") pod \"e7efb65b-b284-4887-b32b-ab6a2499a07f\" (UID: \"e7efb65b-b284-4887-b32b-ab6a2499a07f\") " Apr 21 10:33:39.090727 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.090700 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7efb65b-b284-4887-b32b-ab6a2499a07f-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e7efb65b-b284-4887-b32b-ab6a2499a07f" (UID: "e7efb65b-b284-4887-b32b-ab6a2499a07f"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:33:39.092120 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.092091 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7efb65b-b284-4887-b32b-ab6a2499a07f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e7efb65b-b284-4887-b32b-ab6a2499a07f" (UID: "e7efb65b-b284-4887-b32b-ab6a2499a07f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:33:39.191482 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.191411 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7efb65b-b284-4887-b32b-ab6a2499a07f-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:33:39.191482 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.191436 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7efb65b-b284-4887-b32b-ab6a2499a07f-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:33:39.634140 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.634092 2567 generic.go:358] "Generic (PLEG): container finished" podID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerID="1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece" exitCode=0 Apr 21 10:33:39.634332 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.634161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" event={"ID":"e7efb65b-b284-4887-b32b-ab6a2499a07f","Type":"ContainerDied","Data":"1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece"} Apr 21 10:33:39.634332 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.634188 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" event={"ID":"e7efb65b-b284-4887-b32b-ab6a2499a07f","Type":"ContainerDied","Data":"074b0886eaa077a48db11bb278ba4c6033aa61dd5df79548629ca244ea5bc7ef"} Apr 21 10:33:39.634332 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.634202 2567 scope.go:117] "RemoveContainer" containerID="1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece" Apr 21 10:33:39.634332 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.634200 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb" Apr 21 10:33:39.643881 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.643826 2567 scope.go:117] "RemoveContainer" containerID="1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece" Apr 21 10:33:39.644403 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:33:39.644381 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece\": container with ID starting with 1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece not found: ID does not exist" containerID="1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece" Apr 21 10:33:39.644460 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.644412 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece"} err="failed to get container status \"1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece\": rpc error: code = NotFound desc = could not find container \"1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece\": container with ID starting with 1c6c0406af229c90467bf0f0f5626d6131176c6f7ed5a58972e527470d058ece not found: ID does not exist" Apr 21 10:33:39.655394 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.655370 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb"] Apr 21 10:33:39.660659 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:39.660636 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-c50a0-7b8946587d-pz8fb"] Apr 21 10:33:40.255923 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:40.255890 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" 
path="/var/lib/kubelet/pods/e7efb65b-b284-4887-b32b-ab6a2499a07f/volumes" Apr 21 10:33:42.217986 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:42.217952 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:47.218069 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:47.218027 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:47.218448 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:47.218169 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:33:52.219035 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:52.218996 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:33:52.221386 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:52.221363 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:33:52.223727 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:52.223704 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:33:57.218266 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:33:57.218231 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:02.218081 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:02.218038 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:04.708910 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.708878 2567 generic.go:358] "Generic (PLEG): container finished" podID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerID="395ee79992045e431e6c298a7c6b977bddcbe17e84ef8af1bfa32c7692893e84" exitCode=0 Apr 21 10:34:04.709214 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.708925 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" event={"ID":"bc39bf1d-9a28-407e-b978-72c038a130cb","Type":"ContainerDied","Data":"395ee79992045e431e6c298a7c6b977bddcbe17e84ef8af1bfa32c7692893e84"} Apr 21 10:34:04.823317 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.823296 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:34:04.888401 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.888375 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc39bf1d-9a28-407e-b978-72c038a130cb-openshift-service-ca-bundle\") pod \"bc39bf1d-9a28-407e-b978-72c038a130cb\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " Apr 21 10:34:04.888538 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.888440 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc39bf1d-9a28-407e-b978-72c038a130cb-proxy-tls\") pod \"bc39bf1d-9a28-407e-b978-72c038a130cb\" (UID: \"bc39bf1d-9a28-407e-b978-72c038a130cb\") " Apr 21 10:34:04.888715 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.888692 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc39bf1d-9a28-407e-b978-72c038a130cb-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "bc39bf1d-9a28-407e-b978-72c038a130cb" (UID: "bc39bf1d-9a28-407e-b978-72c038a130cb"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:34:04.890498 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.890471 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc39bf1d-9a28-407e-b978-72c038a130cb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bc39bf1d-9a28-407e-b978-72c038a130cb" (UID: "bc39bf1d-9a28-407e-b978-72c038a130cb"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:34:04.989056 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.988991 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc39bf1d-9a28-407e-b978-72c038a130cb-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:34:04.989056 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:04.989018 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc39bf1d-9a28-407e-b978-72c038a130cb-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:34:05.713161 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:05.713125 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" event={"ID":"bc39bf1d-9a28-407e-b978-72c038a130cb","Type":"ContainerDied","Data":"21845a69c668acee01ee93f6afa2d1fa3c41c3511befbc46459fc7780ace8fb0"} Apr 21 10:34:05.713161 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:05.713144 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v" Apr 21 10:34:05.713161 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:05.713169 2567 scope.go:117] "RemoveContainer" containerID="395ee79992045e431e6c298a7c6b977bddcbe17e84ef8af1bfa32c7692893e84" Apr 21 10:34:05.733062 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:05.733038 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v"] Apr 21 10:34:05.736473 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:05.736451 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-258c6-b67dc4465-ct42v"] Apr 21 10:34:06.256812 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:06.256780 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" path="/var/lib/kubelet/pods/bc39bf1d-9a28-407e-b978-72c038a130cb/volumes" Apr 21 10:34:19.155175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155076 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm"] Apr 21 10:34:19.155658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155568 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" Apr 21 10:34:19.155658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155584 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" Apr 21 10:34:19.155658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155612 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" Apr 21 10:34:19.155658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155621 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" Apr 21 10:34:19.155883 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155695 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc39bf1d-9a28-407e-b978-72c038a130cb" containerName="sequence-graph-258c6" Apr 21 10:34:19.155883 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.155711 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7efb65b-b284-4887-b32b-ab6a2499a07f" containerName="ensemble-graph-c50a0" Apr 21 10:34:19.160090 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.160069 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.162614 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.162590 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e8429-kube-rbac-proxy-sar-config\"" Apr 21 10:34:19.162732 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.162589 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 10:34:19.163682 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.163461 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5m4mw\"" Apr 21 10:34:19.163805 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.163758 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e8429-serving-cert\"" Apr 21 10:34:19.165951 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.165931 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm"] Apr 21 10:34:19.303769 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.303736 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f423d3-2602-46ca-b273-f3a8143807b0-openshift-service-ca-bundle\") pod \"splitter-graph-e8429-56f48fd547-2nhfm\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.303912 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.303792 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61f423d3-2602-46ca-b273-f3a8143807b0-proxy-tls\") pod \"splitter-graph-e8429-56f48fd547-2nhfm\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.404669 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.404636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f423d3-2602-46ca-b273-f3a8143807b0-openshift-service-ca-bundle\") pod \"splitter-graph-e8429-56f48fd547-2nhfm\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.404798 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.404688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61f423d3-2602-46ca-b273-f3a8143807b0-proxy-tls\") pod \"splitter-graph-e8429-56f48fd547-2nhfm\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.405401 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.405357 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f423d3-2602-46ca-b273-f3a8143807b0-openshift-service-ca-bundle\") pod 
\"splitter-graph-e8429-56f48fd547-2nhfm\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.407009 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.406990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61f423d3-2602-46ca-b273-f3a8143807b0-proxy-tls\") pod \"splitter-graph-e8429-56f48fd547-2nhfm\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.472190 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.472167 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.586560 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.586530 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm"] Apr 21 10:34:19.593843 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.593824 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:34:19.752765 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.752675 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" event={"ID":"61f423d3-2602-46ca-b273-f3a8143807b0","Type":"ContainerStarted","Data":"972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1"} Apr 21 10:34:19.752765 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.752722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" event={"ID":"61f423d3-2602-46ca-b273-f3a8143807b0","Type":"ContainerStarted","Data":"6682a2085eb04abcc301850eff8d17f3926f7913a353d3b155faf95d9610e568"} Apr 21 10:34:19.752972 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.752821 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:19.768926 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:19.768875 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podStartSLOduration=0.768857041 podStartE2EDuration="768.857041ms" podCreationTimestamp="2026-04-21 10:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:34:19.76748459 +0000 UTC m=+1828.111934475" watchObservedRunningTime="2026-04-21 10:34:19.768857041 +0000 UTC m=+1828.113306927" Apr 21 10:34:25.762716 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:25.762683 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:29.253793 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:29.253764 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm"] Apr 21 10:34:29.254159 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:29.253941 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" containerID="cri-o://972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1" gracePeriod=30 Apr 21 10:34:30.760250 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:30.760213 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:35.760764 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:35.760729 2567 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:40.760961 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:40.760924 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:40.761379 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:40.761022 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:44.944871 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:44.944835 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m"] Apr 21 10:34:44.948415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:44.948395 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:44.950613 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:44.950591 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b394f-serving-cert\"" Apr 21 10:34:44.950717 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:44.950702 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-b394f-kube-rbac-proxy-sar-config\"" Apr 21 10:34:44.957704 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:44.957679 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m"] Apr 21 10:34:45.108028 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.107998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0d4efc8-538e-415c-930e-dc58aa54450f-proxy-tls\") pod \"switch-graph-b394f-7475f7f5f8-xt62m\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.108200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.108063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d4efc8-538e-415c-930e-dc58aa54450f-openshift-service-ca-bundle\") pod \"switch-graph-b394f-7475f7f5f8-xt62m\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.209147 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.209062 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0d4efc8-538e-415c-930e-dc58aa54450f-proxy-tls\") pod \"switch-graph-b394f-7475f7f5f8-xt62m\" (UID: 
\"e0d4efc8-538e-415c-930e-dc58aa54450f\") " pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.209273 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.209149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d4efc8-538e-415c-930e-dc58aa54450f-openshift-service-ca-bundle\") pod \"switch-graph-b394f-7475f7f5f8-xt62m\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.209742 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.209719 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d4efc8-538e-415c-930e-dc58aa54450f-openshift-service-ca-bundle\") pod \"switch-graph-b394f-7475f7f5f8-xt62m\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.211422 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.211398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0d4efc8-538e-415c-930e-dc58aa54450f-proxy-tls\") pod \"switch-graph-b394f-7475f7f5f8-xt62m\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.258619 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.258592 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.374927 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.374891 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m"] Apr 21 10:34:45.377124 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:34:45.377086 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d4efc8_538e_415c_930e_dc58aa54450f.slice/crio-9fe8c18d37ea4ab3de09bea85907a340f3928cb6153b269abf8aa8f4979b6320 WatchSource:0}: Error finding container 9fe8c18d37ea4ab3de09bea85907a340f3928cb6153b269abf8aa8f4979b6320: Status 404 returned error can't find the container with id 9fe8c18d37ea4ab3de09bea85907a340f3928cb6153b269abf8aa8f4979b6320 Apr 21 10:34:45.760069 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.760035 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:45.831466 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.831436 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" event={"ID":"e0d4efc8-538e-415c-930e-dc58aa54450f","Type":"ContainerStarted","Data":"776452a8f8c2ee03ac91e3f8f2ccf25c7ed1ec87890a24b876afc806274c0c79"} Apr 21 10:34:45.831466 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.831469 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" event={"ID":"e0d4efc8-538e-415c-930e-dc58aa54450f","Type":"ContainerStarted","Data":"9fe8c18d37ea4ab3de09bea85907a340f3928cb6153b269abf8aa8f4979b6320"} Apr 21 10:34:45.831696 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.831490 2567 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:45.846350 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:45.846297 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podStartSLOduration=1.8462808320000001 podStartE2EDuration="1.846280832s" podCreationTimestamp="2026-04-21 10:34:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:34:45.846273088 +0000 UTC m=+1854.190722974" watchObservedRunningTime="2026-04-21 10:34:45.846280832 +0000 UTC m=+1854.190730718" Apr 21 10:34:50.760941 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:50.760902 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:51.840044 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:51.840010 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:34:55.760055 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:55.760018 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:34:59.388777 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.388754 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:59.424177 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.424156 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f423d3-2602-46ca-b273-f3a8143807b0-openshift-service-ca-bundle\") pod \"61f423d3-2602-46ca-b273-f3a8143807b0\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " Apr 21 10:34:59.424296 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.424191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61f423d3-2602-46ca-b273-f3a8143807b0-proxy-tls\") pod \"61f423d3-2602-46ca-b273-f3a8143807b0\" (UID: \"61f423d3-2602-46ca-b273-f3a8143807b0\") " Apr 21 10:34:59.424527 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.424502 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f423d3-2602-46ca-b273-f3a8143807b0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "61f423d3-2602-46ca-b273-f3a8143807b0" (UID: "61f423d3-2602-46ca-b273-f3a8143807b0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:34:59.426098 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.426074 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f423d3-2602-46ca-b273-f3a8143807b0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "61f423d3-2602-46ca-b273-f3a8143807b0" (UID: "61f423d3-2602-46ca-b273-f3a8143807b0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:34:59.525410 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.525340 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61f423d3-2602-46ca-b273-f3a8143807b0-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:34:59.525410 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.525365 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61f423d3-2602-46ca-b273-f3a8143807b0-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:34:59.874129 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.874080 2567 generic.go:358] "Generic (PLEG): container finished" podID="61f423d3-2602-46ca-b273-f3a8143807b0" containerID="972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1" exitCode=137 Apr 21 10:34:59.874288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.874136 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" event={"ID":"61f423d3-2602-46ca-b273-f3a8143807b0","Type":"ContainerDied","Data":"972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1"} Apr 21 10:34:59.874288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.874159 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" Apr 21 10:34:59.874288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.874180 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm" event={"ID":"61f423d3-2602-46ca-b273-f3a8143807b0","Type":"ContainerDied","Data":"6682a2085eb04abcc301850eff8d17f3926f7913a353d3b155faf95d9610e568"} Apr 21 10:34:59.874288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.874197 2567 scope.go:117] "RemoveContainer" containerID="972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1" Apr 21 10:34:59.881890 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.881874 2567 scope.go:117] "RemoveContainer" containerID="972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1" Apr 21 10:34:59.882177 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:34:59.882157 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1\": container with ID starting with 972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1 not found: ID does not exist" containerID="972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1" Apr 21 10:34:59.882250 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.882184 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1"} err="failed to get container status \"972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1\": rpc error: code = NotFound desc = could not find container \"972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1\": container with ID starting with 972c1755f383f3dcff2c31fedc10166b27e461c7160c0a0b906e40ac13ed5aa1 not found: ID does not exist" Apr 21 10:34:59.895204 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:34:59.895182 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm"] Apr 21 10:34:59.900103 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:34:59.900080 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e8429-56f48fd547-2nhfm"] Apr 21 10:35:00.256928 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:00.256852 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" path="/var/lib/kubelet/pods/61f423d3-2602-46ca-b273-f3a8143807b0/volumes" Apr 21 10:35:39.504927 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.504891 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd"] Apr 21 10:35:39.505397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.505207 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" Apr 21 10:35:39.505397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.505218 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" Apr 21 10:35:39.505397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.505279 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="61f423d3-2602-46ca-b273-f3a8143807b0" containerName="splitter-graph-e8429" Apr 21 10:35:39.508183 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.508167 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:39.510573 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.510547 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-932d4-kube-rbac-proxy-sar-config\"" Apr 21 10:35:39.510573 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.510564 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-932d4-serving-cert\"" Apr 21 10:35:39.515507 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.515485 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd"] Apr 21 10:35:39.642991 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.642961 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60239094-1c2d-47d9-ab7f-628520328383-openshift-service-ca-bundle\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:39.643166 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.643047 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:39.743923 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.743895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: 
\"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:39.744048 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.743934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60239094-1c2d-47d9-ab7f-628520328383-openshift-service-ca-bundle\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:39.744048 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:35:39.744033 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-932d4-serving-cert: secret "splitter-graph-932d4-serving-cert" not found Apr 21 10:35:39.744135 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:35:39.744099 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls podName:60239094-1c2d-47d9-ab7f-628520328383 nodeName:}" failed. No retries permitted until 2026-04-21 10:35:40.244083947 +0000 UTC m=+1908.588533815 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls") pod "splitter-graph-932d4-6ffb546d98-w74vd" (UID: "60239094-1c2d-47d9-ab7f-628520328383") : secret "splitter-graph-932d4-serving-cert" not found Apr 21 10:35:39.744544 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:39.744528 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60239094-1c2d-47d9-ab7f-628520328383-openshift-service-ca-bundle\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:40.248958 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:40.248927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:40.251101 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:40.251084 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls\") pod \"splitter-graph-932d4-6ffb546d98-w74vd\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:40.418976 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:40.418948 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:40.531664 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:40.531639 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd"] Apr 21 10:35:40.534482 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:35:40.534455 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60239094_1c2d_47d9_ab7f_628520328383.slice/crio-760e60abdd5877f1ac5a26956419c73a7a60bb772435af823862178559731f8d WatchSource:0}: Error finding container 760e60abdd5877f1ac5a26956419c73a7a60bb772435af823862178559731f8d: Status 404 returned error can't find the container with id 760e60abdd5877f1ac5a26956419c73a7a60bb772435af823862178559731f8d Apr 21 10:35:41.001941 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:41.001843 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" event={"ID":"60239094-1c2d-47d9-ab7f-628520328383","Type":"ContainerStarted","Data":"d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf"} Apr 21 10:35:41.001941 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:41.001885 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" event={"ID":"60239094-1c2d-47d9-ab7f-628520328383","Type":"ContainerStarted","Data":"760e60abdd5877f1ac5a26956419c73a7a60bb772435af823862178559731f8d"} Apr 21 10:35:41.001941 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:41.001907 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:35:41.018273 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:41.018229 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" 
podStartSLOduration=2.01821444 podStartE2EDuration="2.01821444s" podCreationTimestamp="2026-04-21 10:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:35:41.017741937 +0000 UTC m=+1909.362191823" watchObservedRunningTime="2026-04-21 10:35:41.01821444 +0000 UTC m=+1909.362664326" Apr 21 10:35:47.009446 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:35:47.009412 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:38:52.242554 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:38:52.242525 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:38:52.245553 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:38:52.245531 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:43:52.264324 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:43:52.264212 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:43:52.268165 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:43:52.267921 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:43:54.225257 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:43:54.225217 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd"] Apr 21 10:43:54.225696 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:43:54.225508 2567 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" containerID="cri-o://d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf" gracePeriod=30 Apr 21 10:43:57.008450 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:43:57.008415 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:44:02.009214 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:02.009163 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:44:07.008740 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:07.008704 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:44:07.009215 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:07.008797 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:44:12.008731 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:12.008648 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:44:17.008440 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:17.008397 2567 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:44:22.008626 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:22.008589 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:44:24.398850 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.398818 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:44:24.421780 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.421754 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls\") pod \"60239094-1c2d-47d9-ab7f-628520328383\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " Apr 21 10:44:24.422492 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.421835 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60239094-1c2d-47d9-ab7f-628520328383-openshift-service-ca-bundle\") pod \"60239094-1c2d-47d9-ab7f-628520328383\" (UID: \"60239094-1c2d-47d9-ab7f-628520328383\") " Apr 21 10:44:24.422492 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.422221 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60239094-1c2d-47d9-ab7f-628520328383-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "60239094-1c2d-47d9-ab7f-628520328383" (UID: "60239094-1c2d-47d9-ab7f-628520328383"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:44:24.423991 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.423966 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "60239094-1c2d-47d9-ab7f-628520328383" (UID: "60239094-1c2d-47d9-ab7f-628520328383"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:44:24.494516 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.494449 2567 generic.go:358] "Generic (PLEG): container finished" podID="60239094-1c2d-47d9-ab7f-628520328383" containerID="d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf" exitCode=0 Apr 21 10:44:24.494620 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.494513 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" Apr 21 10:44:24.494620 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.494517 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" event={"ID":"60239094-1c2d-47d9-ab7f-628520328383","Type":"ContainerDied","Data":"d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf"} Apr 21 10:44:24.494620 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.494613 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd" event={"ID":"60239094-1c2d-47d9-ab7f-628520328383","Type":"ContainerDied","Data":"760e60abdd5877f1ac5a26956419c73a7a60bb772435af823862178559731f8d"} Apr 21 10:44:24.494738 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.494627 2567 scope.go:117] "RemoveContainer" containerID="d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf" Apr 21 10:44:24.503314 ip-10-0-129-84 kubenswrapper[2567]: 
I0421 10:44:24.503294 2567 scope.go:117] "RemoveContainer" containerID="d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf" Apr 21 10:44:24.503556 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:44:24.503539 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf\": container with ID starting with d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf not found: ID does not exist" containerID="d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf" Apr 21 10:44:24.503604 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.503565 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf"} err="failed to get container status \"d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf\": rpc error: code = NotFound desc = could not find container \"d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf\": container with ID starting with d4a244e076f907570f8d72c05aee12c35e0d567c5de1ccfe8b116d4bd247eebf not found: ID does not exist" Apr 21 10:44:24.516269 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.516249 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd"] Apr 21 10:44:24.520387 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.520369 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-932d4-6ffb546d98-w74vd"] Apr 21 10:44:24.522929 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.522915 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60239094-1c2d-47d9-ab7f-628520328383-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:44:24.522978 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:24.522932 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/60239094-1c2d-47d9-ab7f-628520328383-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:44:26.257133 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:44:26.257090 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60239094-1c2d-47d9-ab7f-628520328383" path="/var/lib/kubelet/pods/60239094-1c2d-47d9-ab7f-628520328383/volumes" Apr 21 10:48:52.284172 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:48:52.284142 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:48:52.288968 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:48:52.288947 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:51:04.222795 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:04.222764 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m"] Apr 21 10:51:04.223313 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:04.222985 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" containerID="cri-o://776452a8f8c2ee03ac91e3f8f2ccf25c7ed1ec87890a24b876afc806274c0c79" gracePeriod=30 Apr 21 10:51:05.484127 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.484089 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7wncp/must-gather-sh7vn"] Apr 21 10:51:05.484490 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.484416 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" Apr 21 10:51:05.484490 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.484427 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" Apr 21 10:51:05.484490 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.484476 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="60239094-1c2d-47d9-ab7f-628520328383" containerName="splitter-graph-932d4" Apr 21 10:51:05.487546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.487529 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.490233 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.490207 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7wncp\"/\"kube-root-ca.crt\"" Apr 21 10:51:05.491141 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.491098 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7wncp\"/\"default-dockercfg-d7pbp\"" Apr 21 10:51:05.491251 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.491154 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7wncp\"/\"openshift-service-ca.crt\"" Apr 21 10:51:05.524140 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.524100 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7wncp/must-gather-sh7vn"] Apr 21 10:51:05.627754 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.627723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe5213a2-0c0e-4161-b537-a0d242b0bab5-must-gather-output\") pod \"must-gather-sh7vn\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " 
pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.627918 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.627787 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2lp\" (UniqueName: \"kubernetes.io/projected/fe5213a2-0c0e-4161-b537-a0d242b0bab5-kube-api-access-pr2lp\") pod \"must-gather-sh7vn\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.728950 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.728920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe5213a2-0c0e-4161-b537-a0d242b0bab5-must-gather-output\") pod \"must-gather-sh7vn\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.729096 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.728976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2lp\" (UniqueName: \"kubernetes.io/projected/fe5213a2-0c0e-4161-b537-a0d242b0bab5-kube-api-access-pr2lp\") pod \"must-gather-sh7vn\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.729262 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.729241 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe5213a2-0c0e-4161-b537-a0d242b0bab5-must-gather-output\") pod \"must-gather-sh7vn\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.737227 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.737173 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2lp\" (UniqueName: \"kubernetes.io/projected/fe5213a2-0c0e-4161-b537-a0d242b0bab5-kube-api-access-pr2lp\") 
pod \"must-gather-sh7vn\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.805392 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.805364 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:05.923578 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.923550 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7wncp/must-gather-sh7vn"] Apr 21 10:51:05.925907 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:51:05.925877 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe5213a2_0c0e_4161_b537_a0d242b0bab5.slice/crio-b8575c3ff57e6914329a46d4b680d507e6e834c42d005541f2276e3f93d965cf WatchSource:0}: Error finding container b8575c3ff57e6914329a46d4b680d507e6e834c42d005541f2276e3f93d965cf: Status 404 returned error can't find the container with id b8575c3ff57e6914329a46d4b680d507e6e834c42d005541f2276e3f93d965cf Apr 21 10:51:05.927552 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:05.927537 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:51:06.643456 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:06.643420 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wncp/must-gather-sh7vn" event={"ID":"fe5213a2-0c0e-4161-b537-a0d242b0bab5","Type":"ContainerStarted","Data":"b8575c3ff57e6914329a46d4b680d507e6e834c42d005541f2276e3f93d965cf"} Apr 21 10:51:06.841700 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:06.841435 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:51:10.660322 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:51:10.660287 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wncp/must-gather-sh7vn" event={"ID":"fe5213a2-0c0e-4161-b537-a0d242b0bab5","Type":"ContainerStarted","Data":"d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f"} Apr 21 10:51:10.660706 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:10.660330 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wncp/must-gather-sh7vn" event={"ID":"fe5213a2-0c0e-4161-b537-a0d242b0bab5","Type":"ContainerStarted","Data":"25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f"} Apr 21 10:51:10.682478 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:10.682428 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7wncp/must-gather-sh7vn" podStartSLOduration=1.6867675370000002 podStartE2EDuration="5.682411828s" podCreationTimestamp="2026-04-21 10:51:05 +0000 UTC" firstStartedPulling="2026-04-21 10:51:05.927655835 +0000 UTC m=+2834.272105697" lastFinishedPulling="2026-04-21 10:51:09.923300125 +0000 UTC m=+2838.267749988" observedRunningTime="2026-04-21 10:51:10.679296086 +0000 UTC m=+2839.023745965" watchObservedRunningTime="2026-04-21 10:51:10.682411828 +0000 UTC m=+2839.026861712" Apr 21 10:51:11.838723 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:11.838676 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:51:16.839294 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:16.839251 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Apr 21 10:51:16.839685 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:16.839367 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:51:18.766561 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:18.766525 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:19.496175 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:19.496139 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:20.232567 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:20.232538 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:20.941280 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:20.941251 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:21.651507 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:21.651459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:21.838581 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:21.838540 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:51:22.364258 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:22.364206 2567 log.go:25] "Finished parsing 
log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:23.080556 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:23.080524 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:23.795898 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:23.795867 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:24.523564 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:24.523534 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:25.241957 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:25.241926 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:25.928631 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:25.928602 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:26.665353 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:26.665318 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-b394f-7475f7f5f8-xt62m_e0d4efc8-538e-415c-930e-dc58aa54450f/switch-graph-b394f/0.log" Apr 21 10:51:26.839205 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:26.839158 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" 
containerName="switch-graph-b394f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:51:28.720753 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:28.720670 2567 generic.go:358] "Generic (PLEG): container finished" podID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerID="25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f" exitCode=0 Apr 21 10:51:28.721169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:28.720743 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wncp/must-gather-sh7vn" event={"ID":"fe5213a2-0c0e-4161-b537-a0d242b0bab5","Type":"ContainerDied","Data":"25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f"} Apr 21 10:51:28.721169 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:28.721087 2567 scope.go:117] "RemoveContainer" containerID="25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f" Apr 21 10:51:29.472394 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:29.472362 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7wncp_must-gather-sh7vn_fe5213a2-0c0e-4161-b537-a0d242b0bab5/gather/0.log" Apr 21 10:51:31.837871 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:31.837828 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:51:32.921749 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:32.921719 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kknsw_66ec4bf1-14e6-42c4-9174-6a6f20406a1c/global-pull-secret-syncer/0.log" Apr 21 10:51:33.105597 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:33.105561 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-vr484_98faf1b6-99bb-4f34-822a-b471ba610d7d/konnectivity-agent/0.log" Apr 21 10:51:33.134946 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:33.134918 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-84.ec2.internal_366ead18e7fb69eb4a529be7bbd9f14e/haproxy/0.log" Apr 21 10:51:34.738314 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.738282 2567 generic.go:358] "Generic (PLEG): container finished" podID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerID="776452a8f8c2ee03ac91e3f8f2ccf25c7ed1ec87890a24b876afc806274c0c79" exitCode=0 Apr 21 10:51:34.738658 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.738354 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" event={"ID":"e0d4efc8-538e-415c-930e-dc58aa54450f","Type":"ContainerDied","Data":"776452a8f8c2ee03ac91e3f8f2ccf25c7ed1ec87890a24b876afc806274c0c79"} Apr 21 10:51:34.858386 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.858365 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:51:34.882805 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.882776 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7wncp/must-gather-sh7vn"] Apr 21 10:51:34.883004 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.882983 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-7wncp/must-gather-sh7vn" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="copy" containerID="cri-o://d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f" gracePeriod=2 Apr 21 10:51:34.887601 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.887536 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7wncp/must-gather-sh7vn"] Apr 21 10:51:34.962137 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.962098 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d4efc8-538e-415c-930e-dc58aa54450f-openshift-service-ca-bundle\") pod \"e0d4efc8-538e-415c-930e-dc58aa54450f\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " Apr 21 10:51:34.962255 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.962197 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0d4efc8-538e-415c-930e-dc58aa54450f-proxy-tls\") pod \"e0d4efc8-538e-415c-930e-dc58aa54450f\" (UID: \"e0d4efc8-538e-415c-930e-dc58aa54450f\") " Apr 21 10:51:34.962412 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.962392 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0d4efc8-538e-415c-930e-dc58aa54450f-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e0d4efc8-538e-415c-930e-dc58aa54450f" (UID: "e0d4efc8-538e-415c-930e-dc58aa54450f"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:51:34.964186 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:34.964162 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0d4efc8-538e-415c-930e-dc58aa54450f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e0d4efc8-538e-415c-930e-dc58aa54450f" (UID: "e0d4efc8-538e-415c-930e-dc58aa54450f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:51:35.063659 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.063618 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0d4efc8-538e-415c-930e-dc58aa54450f-proxy-tls\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:51:35.063659 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.063661 2567 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0d4efc8-538e-415c-930e-dc58aa54450f-openshift-service-ca-bundle\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:51:35.099315 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.099294 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7wncp_must-gather-sh7vn_fe5213a2-0c0e-4161-b537-a0d242b0bab5/copy/0.log" Apr 21 10:51:35.099663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.099646 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:35.101694 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.101669 2567 status_manager.go:895] "Failed to get status for pod" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" pod="openshift-must-gather-7wncp/must-gather-sh7vn" err="pods \"must-gather-sh7vn\" is forbidden: User \"system:node:ip-10-0-129-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7wncp\": no relationship found between node 'ip-10-0-129-84.ec2.internal' and this object" Apr 21 10:51:35.164150 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.164097 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe5213a2-0c0e-4161-b537-a0d242b0bab5-must-gather-output\") pod \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " Apr 21 10:51:35.164249 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.164185 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr2lp\" (UniqueName: \"kubernetes.io/projected/fe5213a2-0c0e-4161-b537-a0d242b0bab5-kube-api-access-pr2lp\") pod \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\" (UID: \"fe5213a2-0c0e-4161-b537-a0d242b0bab5\") " Apr 21 10:51:35.165599 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.165578 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe5213a2-0c0e-4161-b537-a0d242b0bab5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "fe5213a2-0c0e-4161-b537-a0d242b0bab5" (UID: "fe5213a2-0c0e-4161-b537-a0d242b0bab5"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:51:35.166181 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.166165 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5213a2-0c0e-4161-b537-a0d242b0bab5-kube-api-access-pr2lp" (OuterVolumeSpecName: "kube-api-access-pr2lp") pod "fe5213a2-0c0e-4161-b537-a0d242b0bab5" (UID: "fe5213a2-0c0e-4161-b537-a0d242b0bab5"). InnerVolumeSpecName "kube-api-access-pr2lp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:51:35.265634 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.265575 2567 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/fe5213a2-0c0e-4161-b537-a0d242b0bab5-must-gather-output\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:51:35.265634 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.265602 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pr2lp\" (UniqueName: \"kubernetes.io/projected/fe5213a2-0c0e-4161-b537-a0d242b0bab5-kube-api-access-pr2lp\") on node \"ip-10-0-129-84.ec2.internal\" DevicePath \"\"" Apr 21 10:51:35.742845 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.742807 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" event={"ID":"e0d4efc8-538e-415c-930e-dc58aa54450f","Type":"ContainerDied","Data":"9fe8c18d37ea4ab3de09bea85907a340f3928cb6153b269abf8aa8f4979b6320"} Apr 21 10:51:35.742845 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.742817 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m" Apr 21 10:51:35.743321 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.742852 2567 scope.go:117] "RemoveContainer" containerID="776452a8f8c2ee03ac91e3f8f2ccf25c7ed1ec87890a24b876afc806274c0c79" Apr 21 10:51:35.744095 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.744026 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7wncp_must-gather-sh7vn_fe5213a2-0c0e-4161-b537-a0d242b0bab5/copy/0.log" Apr 21 10:51:35.744406 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.744381 2567 generic.go:358] "Generic (PLEG): container finished" podID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerID="d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f" exitCode=143 Apr 21 10:51:35.744505 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.744428 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wncp/must-gather-sh7vn" Apr 21 10:51:35.748887 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.748861 2567 status_manager.go:895] "Failed to get status for pod" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" pod="openshift-must-gather-7wncp/must-gather-sh7vn" err="pods \"must-gather-sh7vn\" is forbidden: User \"system:node:ip-10-0-129-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7wncp\": no relationship found between node 'ip-10-0-129-84.ec2.internal' and this object" Apr 21 10:51:35.750742 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.750717 2567 status_manager.go:895] "Failed to get status for pod" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" pod="openshift-must-gather-7wncp/must-gather-sh7vn" err="pods \"must-gather-sh7vn\" is forbidden: User \"system:node:ip-10-0-129-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7wncp\": no relationship found between node 
'ip-10-0-129-84.ec2.internal' and this object" Apr 21 10:51:35.751698 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.751679 2567 scope.go:117] "RemoveContainer" containerID="d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f" Apr 21 10:51:35.758229 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.758213 2567 scope.go:117] "RemoveContainer" containerID="25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f" Apr 21 10:51:35.774244 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.774202 2567 scope.go:117] "RemoveContainer" containerID="d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f" Apr 21 10:51:35.774520 ip-10-0-129-84 kubenswrapper[2567]: E0421 10:51:35.774499 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f\": container with ID starting with d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f not found: ID does not exist" containerID="d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f" Apr 21 10:51:35.774635 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.774528 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f"} err="failed to get container status \"d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f\": rpc error: code = NotFound desc = could not find container \"d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f\": container with ID starting with d942d939a04b1615a38ae259561d6c2d60ec15be345128123e49b5dfb9bb6a8f not found: ID does not exist" Apr 21 10:51:35.774635 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.774546 2567 scope.go:117] "RemoveContainer" containerID="25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f" Apr 21 10:51:35.774902 ip-10-0-129-84 kubenswrapper[2567]: E0421 
10:51:35.774876 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f\": container with ID starting with 25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f not found: ID does not exist" containerID="25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f" Apr 21 10:51:35.775234 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.775067 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f"} err="failed to get container status \"25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f\": rpc error: code = NotFound desc = could not find container \"25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f\": container with ID starting with 25f08989dfa2717048da936029a0f81150ace036cc19bc0bc7704682758dcd1f not found: ID does not exist" Apr 21 10:51:35.776038 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.776019 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m"] Apr 21 10:51:35.779546 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.779527 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-b394f-7475f7f5f8-xt62m"] Apr 21 10:51:35.780015 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:35.779997 2567 status_manager.go:895] "Failed to get status for pod" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" pod="openshift-must-gather-7wncp/must-gather-sh7vn" err="pods \"must-gather-sh7vn\" is forbidden: User \"system:node:ip-10-0-129-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7wncp\": no relationship found between node 'ip-10-0-129-84.ec2.internal' and this object" Apr 21 10:51:35.781703 ip-10-0-129-84 kubenswrapper[2567]: 
I0421 10:51:35.781686 2567 status_manager.go:895] "Failed to get status for pod" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" pod="openshift-must-gather-7wncp/must-gather-sh7vn" err="pods \"must-gather-sh7vn\" is forbidden: User \"system:node:ip-10-0-129-84.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7wncp\": no relationship found between node 'ip-10-0-129-84.ec2.internal' and this object" Apr 21 10:51:36.256919 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.256883 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" path="/var/lib/kubelet/pods/e0d4efc8-538e-415c-930e-dc58aa54450f/volumes" Apr 21 10:51:36.257250 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.257237 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" path="/var/lib/kubelet/pods/fe5213a2-0c0e-4161-b537-a0d242b0bab5/volumes" Apr 21 10:51:36.352267 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.352244 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/alertmanager/0.log" Apr 21 10:51:36.378200 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.378174 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/config-reloader/0.log" Apr 21 10:51:36.399172 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.399151 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/kube-rbac-proxy-web/0.log" Apr 21 10:51:36.419248 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.419227 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/kube-rbac-proxy/0.log" Apr 21 10:51:36.441125 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.441085 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/kube-rbac-proxy-metric/0.log" Apr 21 10:51:36.459990 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.459971 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/prom-label-proxy/0.log" Apr 21 10:51:36.480566 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.480546 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_abd28c2e-ea3c-490a-8407-1bf197a81d99/init-config-reloader/0.log" Apr 21 10:51:36.633252 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.633228 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5747f67664-2t59r_60918ad2-789d-42c3-ae43-a03117a42abc/metrics-server/0.log" Apr 21 10:51:36.660355 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.660334 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-j7fvv_f2b092e0-5084-4ac8-a541-ebb66bf667a2/monitoring-plugin/0.log" Apr 21 10:51:36.696309 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.696285 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2hxkg_68aa0fc3-11c0-423b-84c6-5f7b2c07e131/node-exporter/0.log" Apr 21 10:51:36.726088 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.726066 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2hxkg_68aa0fc3-11c0-423b-84c6-5f7b2c07e131/kube-rbac-proxy/0.log" Apr 21 10:51:36.750397 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.750377 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2hxkg_68aa0fc3-11c0-423b-84c6-5f7b2c07e131/init-textfile/0.log" Apr 21 
10:51:36.934941 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.934882 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-92xmm_d1a6954c-23c1-4d88-b436-1b9885f0dfc3/kube-rbac-proxy-main/0.log" Apr 21 10:51:36.956911 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.956888 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-92xmm_d1a6954c-23c1-4d88-b436-1b9885f0dfc3/kube-rbac-proxy-self/0.log" Apr 21 10:51:36.980325 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:36.980279 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-92xmm_d1a6954c-23c1-4d88-b436-1b9885f0dfc3/openshift-state-metrics/0.log" Apr 21 10:51:37.028840 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.028820 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/prometheus/0.log" Apr 21 10:51:37.045085 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.045066 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/config-reloader/0.log" Apr 21 10:51:37.065165 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.065145 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/thanos-sidecar/0.log" Apr 21 10:51:37.087548 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.087527 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/kube-rbac-proxy-web/0.log" Apr 21 10:51:37.107877 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.107856 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/kube-rbac-proxy/0.log" Apr 21 10:51:37.131138 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.131097 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/kube-rbac-proxy-thanos/0.log" Apr 21 10:51:37.154415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.154396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ab962edd-a3ba-4beb-aff2-b9311b2938aa/init-config-reloader/0.log" Apr 21 10:51:37.265713 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.265646 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75fdf7498f-rzddk_cffa9bea-51ed-4ef6-a644-9598dc5dcc73/telemeter-client/0.log" Apr 21 10:51:37.287843 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.287825 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75fdf7498f-rzddk_cffa9bea-51ed-4ef6-a644-9598dc5dcc73/reload/0.log" Apr 21 10:51:37.307925 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:37.307898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75fdf7498f-rzddk_cffa9bea-51ed-4ef6-a644-9598dc5dcc73/kube-rbac-proxy/0.log" Apr 21 10:51:40.155930 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.155893 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq"] Apr 21 10:51:40.156560 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156539 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" Apr 21 10:51:40.156663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156564 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" Apr 21 10:51:40.156663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156602 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="copy" Apr 21 10:51:40.156663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156611 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="copy" Apr 21 10:51:40.156663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156639 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="gather" Apr 21 10:51:40.156663 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156648 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="gather" Apr 21 10:51:40.156903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156776 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0d4efc8-538e-415c-930e-dc58aa54450f" containerName="switch-graph-b394f" Apr 21 10:51:40.156903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156799 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="gather" Apr 21 10:51:40.156903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.156811 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe5213a2-0c0e-4161-b537-a0d242b0bab5" containerName="copy" Apr 21 10:51:40.162944 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.162923 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.165718 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.165687 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-95dzh\"/\"openshift-service-ca.crt\"" Apr 21 10:51:40.165844 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.165718 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-95dzh\"/\"kube-root-ca.crt\"" Apr 21 10:51:40.166032 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.166016 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq"] Apr 21 10:51:40.166515 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.166493 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-95dzh\"/\"default-dockercfg-nrjgl\"" Apr 21 10:51:40.305697 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.305667 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-podres\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.305879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.305702 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-sys\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.305879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.305745 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-proc\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.305879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.305779 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-lib-modules\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.305879 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.305817 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtfh\" (UniqueName: \"kubernetes.io/projected/d80ef461-66dd-4940-9b10-16e2b33e13ca-kube-api-access-frtfh\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.406999 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.406922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-proc\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.406999 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.406965 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-lib-modules\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " 
pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407206 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407054 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-proc\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407206 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-lib-modules\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407206 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frtfh\" (UniqueName: \"kubernetes.io/projected/d80ef461-66dd-4940-9b10-16e2b33e13ca-kube-api-access-frtfh\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407206 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407140 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-podres\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407206 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-sys\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407418 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407236 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-sys\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.407418 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.407308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d80ef461-66dd-4940-9b10-16e2b33e13ca-podres\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.415082 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.415060 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtfh\" (UniqueName: \"kubernetes.io/projected/d80ef461-66dd-4940-9b10-16e2b33e13ca-kube-api-access-frtfh\") pod \"perf-node-gather-daemonset-rxrzq\" (UID: \"d80ef461-66dd-4940-9b10-16e2b33e13ca\") " pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.437077 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.437059 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lzplb_f6d6b88a-d380-4539-b153-560938088617/dns/0.log" Apr 21 10:51:40.458895 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.458873 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lzplb_f6d6b88a-d380-4539-b153-560938088617/kube-rbac-proxy/0.log" Apr 21 10:51:40.473102 ip-10-0-129-84 kubenswrapper[2567]: 
I0421 10:51:40.473085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.531687 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.531659 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-d9np2_277f3705-c93f-4057-bbda-a7f798e4406d/dns-node-resolver/0.log" Apr 21 10:51:40.594334 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.594277 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq"] Apr 21 10:51:40.597769 ip-10-0-129-84 kubenswrapper[2567]: W0421 10:51:40.597741 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd80ef461_66dd_4940_9b10_16e2b33e13ca.slice/crio-5796f6dc5d0e84421fd737f44c5483617d89eb681ddd507d4f1c842c68e1b291 WatchSource:0}: Error finding container 5796f6dc5d0e84421fd737f44c5483617d89eb681ddd507d4f1c842c68e1b291: Status 404 returned error can't find the container with id 5796f6dc5d0e84421fd737f44c5483617d89eb681ddd507d4f1c842c68e1b291 Apr 21 10:51:40.759937 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.759870 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" event={"ID":"d80ef461-66dd-4940-9b10-16e2b33e13ca","Type":"ContainerStarted","Data":"042898d5398dee4d9143b6a0d33f133c3d84c2db0476ebad9a941c1aa4584338"} Apr 21 10:51:40.759937 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.759908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" event={"ID":"d80ef461-66dd-4940-9b10-16e2b33e13ca","Type":"ContainerStarted","Data":"5796f6dc5d0e84421fd737f44c5483617d89eb681ddd507d4f1c842c68e1b291"} Apr 21 10:51:40.760083 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.759995 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:40.778733 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:40.778691 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" podStartSLOduration=0.778677016 podStartE2EDuration="778.677016ms" podCreationTimestamp="2026-04-21 10:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:51:40.776382504 +0000 UTC m=+2869.120832389" watchObservedRunningTime="2026-04-21 10:51:40.778677016 +0000 UTC m=+2869.123126901" Apr 21 10:51:41.020054 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:41.019981 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7776896ff5-2q4gp_37210912-0c16-4531-be5a-e4e0262a5e52/registry/0.log" Apr 21 10:51:41.094241 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:41.094217 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j2lqn_3b4e60e7-4ce7-450a-83fc-fd1e09de64f9/node-ca/0.log" Apr 21 10:51:42.116258 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:42.116224 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-h8kbj_e89bee18-57f7-4cb6-9183-9ad08b859350/serve-healthcheck-canary/0.log" Apr 21 10:51:42.690339 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:42.690308 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x974r_095a4938-5652-428e-9f3f-d766898b0bab/kube-rbac-proxy/0.log" Apr 21 10:51:42.725996 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:42.725971 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x974r_095a4938-5652-428e-9f3f-d766898b0bab/exporter/0.log" Apr 21 10:51:42.763052 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:51:42.763030 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x974r_095a4938-5652-428e-9f3f-d766898b0bab/extractor/0.log" Apr 21 10:51:44.736639 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:44.736582 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-mm8cg_f70e23cc-b117-459b-9f7d-bfd08eaf9280/server/0.log" Apr 21 10:51:45.173889 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:45.173856 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-gg64n_ae7dc782-b5b4-40b6-9957-e0e724ebdbcd/manager/0.log" Apr 21 10:51:45.196456 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:45.196432 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-9hzb5_eb58351b-d299-4c51-aabe-7f3625ac6226/s3-init/0.log" Apr 21 10:51:46.772855 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:46.772823 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-95dzh/perf-node-gather-daemonset-rxrzq" Apr 21 10:51:50.198821 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.198749 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/kube-multus-additional-cni-plugins/0.log" Apr 21 10:51:50.217818 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.217795 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/egress-router-binary-copy/0.log" Apr 21 10:51:50.236584 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.236560 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/cni-plugins/0.log" Apr 21 10:51:50.255710 ip-10-0-129-84 kubenswrapper[2567]: I0421 
10:51:50.255687 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/bond-cni-plugin/0.log" Apr 21 10:51:50.273997 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.273977 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/routeoverride-cni/0.log" Apr 21 10:51:50.292173 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.292153 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/whereabouts-cni-bincopy/0.log" Apr 21 10:51:50.312156 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.312136 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-65267_31f3696c-a468-44ca-9299-c2d8a53166c8/whereabouts-cni/0.log" Apr 21 10:51:50.652288 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.652255 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kchl2_e2f35c99-2ff3-4631-9d48-aec797338383/kube-multus/0.log" Apr 21 10:51:50.792903 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.792870 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vrs72_1b8b33c9-4316-4863-843f-730d4490910b/network-metrics-daemon/0.log" Apr 21 10:51:50.809603 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:50.809573 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vrs72_1b8b33c9-4316-4863-843f-730d4490910b/kube-rbac-proxy/0.log" Apr 21 10:51:51.890224 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:51.890191 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-controller/0.log" Apr 21 10:51:51.911038 
ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:51.911013 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/0.log" Apr 21 10:51:51.935933 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:51.935906 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovn-acl-logging/1.log" Apr 21 10:51:51.953583 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:51.953558 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/kube-rbac-proxy-node/0.log" Apr 21 10:51:51.973415 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:51.973394 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 10:51:51.992140 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:51.992103 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/northd/0.log" Apr 21 10:51:52.013751 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:52.013732 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/nbdb/0.log" Apr 21 10:51:52.035761 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:52.035738 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/sbdb/0.log" Apr 21 10:51:52.232270 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:52.232199 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l2qr9_0d15ebaa-acf8-4b85-8a72-fdb57a04e985/ovnkube-controller/0.log" Apr 21 10:51:53.563214 ip-10-0-129-84 
kubenswrapper[2567]: I0421 10:51:53.563181 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lf67k_8cb513a2-6bf9-465c-bac3-8b87096c0e4e/network-check-target-container/0.log" Apr 21 10:51:54.492702 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:54.492673 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-p7n2z_a304ded0-8dcb-4d07-b2f5-a18c53303a25/iptables-alerter/0.log" Apr 21 10:51:55.078172 ip-10-0-129-84 kubenswrapper[2567]: I0421 10:51:55.078132 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-z82s8_bd1a8429-25a8-49c9-9dda-7a286fbe2767/tuned/0.log"