Apr 23 17:52:35.066101 ip-10-0-130-202 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:52:35.545757 ip-10-0-130-202 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:35.545757 ip-10-0-130-202 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:52:35.545757 ip-10-0-130-202 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:35.545757 ip-10-0-130-202 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:52:35.545757 ip-10-0-130-202 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:52:35.547691 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.547607 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:52:35.552557 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552536 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:35.552557 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552554 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:35.552557 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552558 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:35.552557 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552561 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:35.552557 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552564 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:35.552557 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552567 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552570 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552573 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552577 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552579 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552584 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552588 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552592 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552595 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552598 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552600 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552603 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552606 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552608 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552611 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552613 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552616 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552618 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552621 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:35.552769 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552623 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552626 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552629 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552632 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552634 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552637 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552639 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552642 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552645 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552647 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552650 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552652 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552655 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552657 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552660 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552663 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552666 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552669 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552671 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552675 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:35.553295 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552677 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552680 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552682 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552685 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552687 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552690 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552692 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552695 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552697 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552700 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552703 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552706 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552708 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552710 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552713 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552715 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552718 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552721 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552723 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552726 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:35.553775 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552728 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552731 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552733 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552736 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552739 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552741 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552745 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552748 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552751 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552753 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552756 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552758 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552761 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552766 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552770 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552774 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552777 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552779 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552782 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:35.554275 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552785 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552788 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.552791 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553233 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553240 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553244 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553247 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553250 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553253 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553256 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553259 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553262 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553265 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553267 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553269 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553272 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553275 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553277 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553280 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:52:35.554725 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553283 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553285 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553289 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553291 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553294 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553297 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553299 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553302 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553305 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553308 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553311 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553313 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553316 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553318 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553321 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553323 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553326 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553328 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553331 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553334 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:52:35.555196 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553336 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553338 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553341 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553344 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553346 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553349 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553351 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553354 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553356 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553359 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553362 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553364 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553367 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553369 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553372 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553374 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553377 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553379 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553382 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553384 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:52:35.555686 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553387 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553390 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553392 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553395 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553397 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553400 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553402 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553405 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553407 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553410 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553413 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553415 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553417 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553420 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553422 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553426 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553430 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553432 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553435 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553438 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:52:35.556189 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553441 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553444 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553446 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553449 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553451 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553454 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553456 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553459 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553461 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.553464 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553531 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553538 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553544 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553550 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553557 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553562 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553569 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553576 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553580 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553583 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553586 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553589 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:52:35.556673 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553596 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553599 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553602 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553605 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553608 2579 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553610 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553613 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553618 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553620 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553623 2579 flags.go:64] FLAG: --config-dir=""
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553627 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553631 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553635 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553638 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553641 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553644 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553647 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553650 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553653 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553656 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553659 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553663 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553666 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553669 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553671 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:52:35.557258 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553675 2579 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553678 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553682 2579 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553685 2579 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553688 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553691 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553694 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553699 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553702 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553705 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553708 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553711 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553714 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553717 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553720 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553723 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553726 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553729 2579 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553733 2579 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553736 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553739 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553742 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553745 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553748 2579 flags.go:64] FLAG: --help="false" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553751 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-130-202.ec2.internal" Apr 23 17:52:35.557864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553754 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553757 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553760 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553763 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553766 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553769 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553772 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:52:35.558493 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:52:35.553774 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553778 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553781 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553784 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553787 2579 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553790 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553793 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553797 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553800 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553803 2579 flags.go:64] FLAG: --lock-file="" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553806 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553808 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553811 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553816 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553819 2579 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553834 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 17:52:35.558493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553837 2579 flags.go:64] FLAG: --logging-format="text" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553840 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553844 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553847 2579 flags.go:64] FLAG: --manifest-url="" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553850 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553855 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553858 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553862 2579 flags.go:64] FLAG: --max-pods="110" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553865 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553868 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553871 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553873 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553876 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:52:35.559071 
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553879 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553882 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553890 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553893 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553896 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553900 2579 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553903 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553908 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553911 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553914 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553919 2579 flags.go:64] FLAG: --port="10250" Apr 23 17:52:35.559071 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553922 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553925 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0299892dca09885a7" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553928 2579 flags.go:64] FLAG: --qos-reserved="" Apr 
23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553931 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553934 2579 flags.go:64] FLAG: --register-node="true" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553936 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553939 2579 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553943 2579 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553946 2579 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553948 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553952 2579 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553955 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553958 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553961 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553964 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553967 2579 flags.go:64] FLAG: --runonce="false" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553970 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553973 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553976 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553979 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553981 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553984 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553987 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553990 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553993 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553996 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:52:35.559665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.553999 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554002 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554005 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554008 2579 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554010 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554020 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:52:35.560313 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:52:35.554023 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554025 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554030 2579 flags.go:64] FLAG: --tls-min-version="" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554033 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554035 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554038 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554041 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554044 2579 flags.go:64] FLAG: --v="2" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554049 2579 flags.go:64] FLAG: --version="false" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554058 2579 flags.go:64] FLAG: --vmodule="" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554062 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554066 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554156 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554160 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554163 2579 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554166 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554170 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:35.560313 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554173 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554176 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554178 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554181 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554184 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554186 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554189 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554191 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554194 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554197 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: 
W0423 17:52:35.554199 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554202 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554205 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554207 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554211 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554214 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554216 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554219 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554222 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554225 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:35.560908 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554227 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554230 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554233 2579 feature_gate.go:328] unrecognized feature 
gate: AzureClusterHostedDNSInstall Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554236 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554238 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554241 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554243 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554246 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554248 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554251 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554253 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554256 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554258 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554261 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554263 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:35.561425 ip-10-0-130-202 
kubenswrapper[2579]: W0423 17:52:35.554266 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554268 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554271 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554273 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554275 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:35.561425 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554278 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554281 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554283 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554286 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554289 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554292 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554296 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554298 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 
17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554301 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554303 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554306 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554309 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554312 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554314 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554317 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554321 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554324 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554327 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554330 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554332 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:52:35.561933 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554335 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554338 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554340 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554343 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554346 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554348 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554351 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554353 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554356 2579 feature_gate.go:328] unrecognized 
feature gate: HighlyAvailableArbiter Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554360 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554363 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554366 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554369 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554372 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554375 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554377 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554380 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554383 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554385 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:35.562418 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554388 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.554390 2579 feature_gate.go:328] 
unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.554993 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.562158 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.562174 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562224 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562229 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562233 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562236 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562239 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562242 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562245 2579 feature_gate.go:328] unrecognized 
feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562249 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562252 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562255 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562258 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:35.562899 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562260 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562263 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562265 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562268 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562270 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562273 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562275 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562278 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 
17:52:35.562280 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562283 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562287 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562291 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562294 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562296 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562299 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562301 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562304 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562306 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562309 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:35.563321 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562311 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:35.563779 ip-10-0-130-202 
kubenswrapper[2579]: W0423 17:52:35.562315 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562318 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562320 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562323 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562325 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562328 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562330 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562333 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562335 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562338 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562341 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562343 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562346 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 
17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562348 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562351 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562353 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562355 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562358 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562361 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:35.563779 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562363 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562365 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562368 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562371 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562373 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562376 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562378 2579 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562381 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562383 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562385 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562388 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562390 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562395 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562400 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562404 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562407 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562410 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562412 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562415 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562417 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:35.564304 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562420 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562423 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562425 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562428 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562431 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562434 2579 feature_gate.go:328] unrecognized feature 
gate: VSphereMultiDisk Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562436 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562439 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562441 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562444 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562446 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562449 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562452 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562454 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562456 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562459 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:35.564781 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.562464 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562585 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562590 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562594 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562596 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562599 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562602 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562604 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562607 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562610 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562613 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562616 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:52:35.565292 ip-10-0-130-202 
kubenswrapper[2579]: W0423 17:52:35.562620 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562624 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562627 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562629 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562632 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562634 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562637 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:52:35.565292 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562639 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562642 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562644 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562647 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562649 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562652 2579 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562654 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562657 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562660 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562664 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562667 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562670 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562672 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562675 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562677 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562680 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562682 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562685 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 23 
17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562687 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562690 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:52:35.565757 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562692 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562694 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562697 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562700 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562703 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562706 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562709 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562711 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562713 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562716 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562718 2579 feature_gate.go:328] unrecognized 
feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562721 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562723 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562726 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562728 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562731 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562733 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562735 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562738 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562740 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:52:35.566282 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562743 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562745 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562748 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:52:35.566804 ip-10-0-130-202 
kubenswrapper[2579]: W0423 17:52:35.562750 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562752 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562755 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562757 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562759 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562762 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562764 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562767 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562769 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562771 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562774 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562776 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562779 2579 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562781 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562784 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562789 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562792 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:52:35.566804 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562794 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562797 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562799 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562801 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562804 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562806 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562808 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:35.562811 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:52:35.567296 
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.562816 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:52:35.567296 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.563497 2579 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 17:52:35.567666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.567652 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 17:52:35.568534 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.568522 2579 server.go:1019] "Starting client certificate rotation" Apr 23 17:52:35.568631 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.568615 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:52:35.568681 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.568650 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:52:35.592202 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.592184 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:52:35.594511 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.594483 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:52:35.613862 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.613841 2579 log.go:25] "Validated CRI v1 
runtime API" Apr 23 17:52:35.618963 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.618948 2579 log.go:25] "Validated CRI v1 image API" Apr 23 17:52:35.621073 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.621043 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 17:52:35.625040 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.625011 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b7d09a57-0b0c-44a4-b559-6de677316cc0:/dev/nvme0n1p3 b9dd92ad-4bed-4c3c-a042-68657de33bd2:/dev/nvme0n1p4] Apr 23 17:52:35.625040 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.625035 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 17:52:35.630633 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.630516 2579 manager.go:217] Machine: {Timestamp:2026-04-23 17:52:35.62891992 +0000 UTC m=+0.444285560 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3089623 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b8992acf15e1e5cf698b92200eddd SystemUUID:ec2b8992-acf1-5e1e-5cf6-98b92200eddd BootID:028a7371-f8b2-4153-92d0-e46052ee73d0 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:62:05:d1:29:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:62:05:d1:29:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:a5:0d:97:5e:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 17:52:35.630633 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.630622 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 17:52:35.630756 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.630695 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 17:52:35.632249 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.632223 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 17:52:35.632386 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.632252 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-202.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 17:52:35.632433 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.632396 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 17:52:35.632433 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.632404 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 17:52:35.632433 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.632417 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:52:35.633034 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.633015 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 17:52:35.633964 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.633954 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:52:35.634075 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.634066 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 17:52:35.634763 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.634746 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:52:35.640499 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.640486 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 17:52:35.640542 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.640504 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 17:52:35.640542 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.640516 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 17:52:35.640542 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.640526 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 23 17:52:35.640542 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.640534 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 17:52:35.641607 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.641596 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:52:35.641644 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.641614 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 17:52:35.643926 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.643910 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 17:52:35.645537 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.645511 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 17:52:35.646867 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646855 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646873 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646879 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646885 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646890 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646896 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646902 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646908 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 17:52:35.646910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646914 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 17:52:35.647111 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646921 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 17:52:35.647111 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646934 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 17:52:35.647111 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.646943 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 17:52:35.647894 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.647886 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 17:52:35.647930 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.647895 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 17:52:35.654929 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.654906 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 17:52:35.655019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.654954 2579 server.go:1295] "Started kubelet"
Apr 23 17:52:35.655812 ip-10-0-130-202 systemd[1]: Started Kubernetes Kubelet.
Apr 23 17:52:35.656099 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.656042 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 17:52:35.656181 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.656135 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 17:52:35.656573 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.656532 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 17:52:35.657065 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.657039 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-202.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 17:52:35.657065 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.657041 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 17:52:35.657213 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.656550 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-202.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 17:52:35.658667 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.658637 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f8gbt"
Apr 23 17:52:35.659261 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.659242 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 17:52:35.659391 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.659379 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 17:52:35.661595 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.660785 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-202.ec2.internal.18a90dd9787d7949 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-202.ec2.internal,UID:ip-10-0-130-202.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-202.ec2.internal,},FirstTimestamp:2026-04-23 17:52:35.654924617 +0000 UTC m=+0.470290262,LastTimestamp:2026-04-23 17:52:35.654924617 +0000 UTC m=+0.470290262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-202.ec2.internal,}"
Apr 23 17:52:35.665006 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.664986 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 17:52:35.665082 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.664991 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 17:52:35.665757 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.665739 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:35.665859 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.665806 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 17:52:35.666040 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.666020 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 17:52:35.666088 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.666046 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 17:52:35.666182 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.666169 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 17:52:35.666220 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.666184 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 17:52:35.666396 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.666368 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f8gbt"
Apr 23 17:52:35.668171 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668156 2579 factory.go:55] Registering systemd factory
Apr 23 17:52:35.668171 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668172 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 23 17:52:35.668402 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668391 2579 factory.go:153] Registering CRI-O factory
Apr 23 17:52:35.668402 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668402 2579 factory.go:223] Registration of the crio container factory successfully
Apr 23 17:52:35.668468 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668445 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 17:52:35.668498 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668481 2579 factory.go:103] Registering Raw factory
Apr 23 17:52:35.668498 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.668493 2579 manager.go:1196] Started watching for new ooms in manager
Apr 23 17:52:35.668885 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.668867 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 17:52:35.669238 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.669139 2579 manager.go:319] Starting recovery of all containers
Apr 23 17:52:35.673039 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.673016 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:35.675705 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.675685 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-202.ec2.internal\" not found" node="ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.679925 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.679685 2579 manager.go:324] Recovery completed
Apr 23 17:52:35.684149 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.684136 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.686240 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.686226 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.686301 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.686254 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.686301 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.686265 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.686771 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.686758 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 17:52:35.686771 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.686769 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 17:52:35.686865 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.686785 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 17:52:35.688938 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.688926 2579 policy_none.go:49] "None policy: Start"
Apr 23 17:52:35.689000 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.688942 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 17:52:35.689000 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.688952 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 17:52:35.726452 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.726436 2579 manager.go:341] "Starting Device Plugin manager"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.726538 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.726555 2579 server.go:85] "Starting device plugin registration server"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.726785 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.726796 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.726946 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.727030 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.727040 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.727632 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 17:52:35.730446 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.727661 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:35.796626 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.796548 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 17:52:35.797956 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.797939 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 17:52:35.798037 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.797973 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 17:52:35.798037 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.797995 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 17:52:35.798037 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.798006 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 17:52:35.798147 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.798048 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 17:52:35.799987 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.799969 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:35.827056 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.827032 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.827880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.827865 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.827959 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.827893 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.827959 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.827903 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.827959 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.827925 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.836204 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.836189 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.836286 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.836210 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-202.ec2.internal\": node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:35.847986 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.847964 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:35.898247 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.898214 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal"]
Apr 23 17:52:35.898336 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.898286 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.899985 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.899971 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.900055 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.899998 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.900055 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.900012 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.901256 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901243 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.901421 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901409 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.901461 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901437 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.901916 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901903 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.901987 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901918 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.901987 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901941 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.901987 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901951 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.901987 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901922 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.902121 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.901999 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.903038 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.903021 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.903116 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.903047 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 17:52:35.903670 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.903653 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 17:52:35.903764 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.903682 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 17:52:35.903764 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.903695 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeHasSufficientPID"
Apr 23 17:52:35.924426 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.924409 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-202.ec2.internal\" not found" node="ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.928746 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.928729 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-202.ec2.internal\" not found" node="ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.948564 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:35.948544 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:35.968620 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.968599 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc16a910ec1da67194fc461e51861565-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal\" (UID: \"fc16a910ec1da67194fc461e51861565\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.968701 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.968623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc16a910ec1da67194fc461e51861565-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal\" (UID: \"fc16a910ec1da67194fc461e51861565\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:35.968701 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:35.968640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5920d41245667f30b303f7ab0e3ca5d3-config\") pod \"kube-apiserver-proxy-ip-10-0-130-202.ec2.internal\" (UID: \"5920d41245667f30b303f7ab0e3ca5d3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.049252 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.049187 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.069613 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.069594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc16a910ec1da67194fc461e51861565-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal\" (UID: \"fc16a910ec1da67194fc461e51861565\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.069723 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.069625 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc16a910ec1da67194fc461e51861565-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal\" (UID: \"fc16a910ec1da67194fc461e51861565\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.069723 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.069644 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5920d41245667f30b303f7ab0e3ca5d3-config\") pod \"kube-apiserver-proxy-ip-10-0-130-202.ec2.internal\" (UID: \"5920d41245667f30b303f7ab0e3ca5d3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.069723 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.069688 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5920d41245667f30b303f7ab0e3ca5d3-config\") pod \"kube-apiserver-proxy-ip-10-0-130-202.ec2.internal\" (UID: \"5920d41245667f30b303f7ab0e3ca5d3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.069723 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.069710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc16a910ec1da67194fc461e51861565-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal\" (UID: \"fc16a910ec1da67194fc461e51861565\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.069870 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.069709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc16a910ec1da67194fc461e51861565-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal\" (UID: \"fc16a910ec1da67194fc461e51861565\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.149995 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.149967 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.227484 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.227449 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.231174 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.231160 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal"
Apr 23 17:52:36.250776 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.250751 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.351285 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.351201 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.451765 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.451732 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.552241 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.552209 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.568596 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.568574 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:52:36.568741 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.568717 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:52:36.568800 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.568745 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:52:36.652360 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.652281 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.665529 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.665511 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:52:36.668501 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.668463 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:47:35 +0000 UTC" deadline="2028-01-22 20:15:52.585790282 +0000 UTC"
Apr 23 17:52:36.668590 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.668500 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15338h23m15.917294179s"
Apr 23 17:52:36.679204 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.679185 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:52:36.686210 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.686191 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:52:36.699782 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.699759 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5dnff"
Apr 23 17:52:36.707202 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.707186 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5dnff"
Apr 23 17:52:36.752944 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.752922 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.801331 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.801277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal" event={"ID":"fc16a910ec1da67194fc461e51861565","Type":"ContainerStarted","Data":"4da68d0588b1e4afbca72e8c17a88800a86c561967e1804a3af2f89cdf894483"}
Apr 23 17:52:36.802166 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:36.802143 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal" event={"ID":"5920d41245667f30b303f7ab0e3ca5d3","Type":"ContainerStarted","Data":"8175afc6e9e6d793abcf3fe4eba0d24163b7af9c63966c7f927ac1e27d8ce78a"}
Apr 23 17:52:36.853351 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.853311 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:36.953923 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:36.953855 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:37.054400 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:37.054360 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found"
Apr 23 17:52:37.142677
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.142632 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:52:37.155474 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:52:37.155439 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-202.ec2.internal\" not found" Apr 23 17:52:37.248957 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.248717 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:52:37.265325 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.265293 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal" Apr 23 17:52:37.276468 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.276445 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:52:37.277318 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.277294 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal" Apr 23 17:52:37.286126 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.286105 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 17:52:37.394513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.394481 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:52:37.641880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.641849 2579 apiserver.go:52] "Watching apiserver" Apr 23 17:52:37.646869 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.646847 2579 reflector.go:430] "Caches populated" 
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 17:52:37.647106 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.647086 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jbbn4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal","kube-system/konnectivity-agent-84tsb","kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l","openshift-cluster-node-tuning-operator/tuned-kqfft"] Apr 23 17:52:37.649985 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.649965 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jbbn4" Apr 23 17:52:37.650305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.650288 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-84tsb" Apr 23 17:52:37.651350 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.651321 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.652963 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.652940 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 17:52:37.653042 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.652946 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-vjs8g\"" Apr 23 17:52:37.653109 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.653094 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.653990 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.653968 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4nmd5\"" Apr 23 17:52:37.653990 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.653981 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 17:52:37.654143 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.654006 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 17:52:37.654143 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.654011 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 17:52:37.654379 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.654359 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 17:52:37.654379 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.654370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 17:52:37.654525 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.654397 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 17:52:37.656069 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.656046 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bq2xd\"" Apr 23 17:52:37.656163 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.656115 2579 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gzl8x\"" Apr 23 17:52:37.656224 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.656205 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 17:52:37.656337 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.656322 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:52:37.656337 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.656335 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 17:52:37.667014 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.666998 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:52:37.678859 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678714 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-device-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.678859 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysctl-d\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.678859 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.678859 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.678859 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678853 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-tuned\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkt66\" (UniqueName: \"kubernetes.io/projected/b6c26cb5-e282-4840-a6ae-c60523f49733-kube-api-access-hkt66\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysconfig\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679127 
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a1cf0add-6b75-4312-8f16-a023261bbef3-konnectivity-ca\") pod \"konnectivity-agent-84tsb\" (UID: \"a1cf0add-6b75-4312-8f16-a023261bbef3\") " pod="kube-system/konnectivity-agent-84tsb" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.678962 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhj56\" (UniqueName: \"kubernetes.io/projected/f1de2051-f9cd-44f4-b735-5b15499925c2-kube-api-access-fhj56\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679005 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-systemd\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-host\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6c26cb5-e282-4840-a6ae-c60523f49733-serviceca\") pod \"node-ca-jbbn4\" (UID: 
\"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysctl-conf\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679127 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-lib-modules\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679146 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhjp\" (UniqueName: \"kubernetes.io/projected/12d2f5b6-b903-4c15-857e-f255d7d678fd-kube-api-access-rdhjp\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6c26cb5-e282-4840-a6ae-c60523f49733-host\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-socket-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679264 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-registration-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679302 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-sys-fs\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679347 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-kubernetes\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679374 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-sys\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: 
I0423 17:52:37.679402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-var-lib-kubelet\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679424 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12d2f5b6-b903-4c15-857e-f255d7d678fd-tmp\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679446 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-modprobe-d\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a1cf0add-6b75-4312-8f16-a023261bbef3-agent-certs\") pod \"konnectivity-agent-84tsb\" (UID: \"a1cf0add-6b75-4312-8f16-a023261bbef3\") " pod="kube-system/konnectivity-agent-84tsb" Apr 23 17:52:37.679548 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.679535 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-run\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" 
Apr 23 17:52:37.708475 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.708443 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:47:36 +0000 UTC" deadline="2027-11-25 04:44:52.783221452 +0000 UTC" Apr 23 17:52:37.708475 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.708470 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13930h52m15.07475327s" Apr 23 17:52:37.780394 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-modprobe-d\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a1cf0add-6b75-4312-8f16-a023261bbef3-agent-certs\") pod \"konnectivity-agent-84tsb\" (UID: \"a1cf0add-6b75-4312-8f16-a023261bbef3\") " pod="kube-system/konnectivity-agent-84tsb" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-run\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-device-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" 
(UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysctl-d\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.780560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-modprobe-d\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-tuned\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-device-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780590 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-run\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkt66\" (UniqueName: \"kubernetes.io/projected/b6c26cb5-e282-4840-a6ae-c60523f49733-kube-api-access-hkt66\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780652 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysconfig\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a1cf0add-6b75-4312-8f16-a023261bbef3-konnectivity-ca\") pod \"konnectivity-agent-84tsb\" (UID: \"a1cf0add-6b75-4312-8f16-a023261bbef3\") " pod="kube-system/konnectivity-agent-84tsb" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhj56\" (UniqueName: \"kubernetes.io/projected/f1de2051-f9cd-44f4-b735-5b15499925c2-kube-api-access-fhj56\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysctl-d\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-systemd\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft" Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780773 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-host\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780783 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780795 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6c26cb5-e282-4840-a6ae-c60523f49733-serviceca\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4"
Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780819 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysctl-conf\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-etc-selinux\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780872 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-lib-modules\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.780891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhjp\" (UniqueName: \"kubernetes.io/projected/12d2f5b6-b903-4c15-857e-f255d7d678fd-kube-api-access-rdhjp\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysconfig\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6c26cb5-e282-4840-a6ae-c60523f49733-host\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-socket-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-registration-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-host\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-systemd\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.780993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-sys-fs\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-kubernetes\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-sys\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-var-lib-kubelet\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12d2f5b6-b903-4c15-857e-f255d7d678fd-tmp\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-sys-fs\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-sysctl-conf\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781243 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6c26cb5-e282-4840-a6ae-c60523f49733-host\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781303 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a1cf0add-6b75-4312-8f16-a023261bbef3-konnectivity-ca\") pod \"konnectivity-agent-84tsb\" (UID: \"a1cf0add-6b75-4312-8f16-a023261bbef3\") " pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-socket-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781337 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-lib-modules\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.781416 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b6c26cb5-e282-4840-a6ae-c60523f49733-serviceca\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4"
Apr 23 17:52:37.782372 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1de2051-f9cd-44f4-b735-5b15499925c2-registration-dir\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.782372 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781507 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-sys\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.782372 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-var-lib-kubelet\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.782372 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.781575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-kubernetes\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.784108 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.784081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12d2f5b6-b903-4c15-857e-f255d7d678fd-etc-tuned\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.784215 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.784123 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12d2f5b6-b903-4c15-857e-f255d7d678fd-tmp\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.784215 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.784210 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a1cf0add-6b75-4312-8f16-a023261bbef3-agent-certs\") pod \"konnectivity-agent-84tsb\" (UID: \"a1cf0add-6b75-4312-8f16-a023261bbef3\") " pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:52:37.789683 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.789663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkt66\" (UniqueName: \"kubernetes.io/projected/b6c26cb5-e282-4840-a6ae-c60523f49733-kube-api-access-hkt66\") pod \"node-ca-jbbn4\" (UID: \"b6c26cb5-e282-4840-a6ae-c60523f49733\") " pod="openshift-image-registry/node-ca-jbbn4"
Apr 23 17:52:37.789903 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.789882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhjp\" (UniqueName: \"kubernetes.io/projected/12d2f5b6-b903-4c15-857e-f255d7d678fd-kube-api-access-rdhjp\") pod \"tuned-kqfft\" (UID: \"12d2f5b6-b903-4c15-857e-f255d7d678fd\") " pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:37.790809 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.790789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhj56\" (UniqueName: \"kubernetes.io/projected/f1de2051-f9cd-44f4-b735-5b15499925c2-kube-api-access-fhj56\") pod \"aws-ebs-csi-driver-node-vps6l\" (UID: \"f1de2051-f9cd-44f4-b735-5b15499925c2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.902915 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.902836 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:52:37.961502 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.961455 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jbbn4"
Apr 23 17:52:37.968336 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.968312 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:52:37.977214 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.977194 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l"
Apr 23 17:52:37.981795 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:37.981778 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kqfft"
Apr 23 17:52:38.260061 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:38.260035 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1de2051_f9cd_44f4_b735_5b15499925c2.slice/crio-73b7dc3e2dcfa50a5c20ec374d4ebef635bfb6a29054164caf4525000e89783d WatchSource:0}: Error finding container 73b7dc3e2dcfa50a5c20ec374d4ebef635bfb6a29054164caf4525000e89783d: Status 404 returned error can't find the container with id 73b7dc3e2dcfa50a5c20ec374d4ebef635bfb6a29054164caf4525000e89783d
Apr 23 17:52:38.261982 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:38.260664 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1cf0add_6b75_4312_8f16_a023261bbef3.slice/crio-4fddb87d500d9176bae1d491baeb65c3936a4879aecc57090d3e84c850dc33a7 WatchSource:0}: Error finding container 4fddb87d500d9176bae1d491baeb65c3936a4879aecc57090d3e84c850dc33a7: Status 404 returned error can't find the container with id 4fddb87d500d9176bae1d491baeb65c3936a4879aecc57090d3e84c850dc33a7
Apr 23 17:52:38.263755 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:52:38.263728 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c26cb5_e282_4840_a6ae_c60523f49733.slice/crio-16c6924215089ac3235de2cc966a174ba89132bd9eb7ba6e3fcc8d0bf3c819ae WatchSource:0}: Error finding container 16c6924215089ac3235de2cc966a174ba89132bd9eb7ba6e3fcc8d0bf3c819ae: Status 404 returned error can't find the container with id 16c6924215089ac3235de2cc966a174ba89132bd9eb7ba6e3fcc8d0bf3c819ae
Apr 23 17:52:38.709203 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.708914 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:47:36 +0000 UTC" deadline="2028-01-26 21:47:52.519107528 +0000 UTC"
Apr 23 17:52:38.709203 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.709151 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15435h55m13.809960741s"
Apr 23 17:52:38.808742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.807504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-84tsb" event={"ID":"a1cf0add-6b75-4312-8f16-a023261bbef3","Type":"ContainerStarted","Data":"4fddb87d500d9176bae1d491baeb65c3936a4879aecc57090d3e84c850dc33a7"}
Apr 23 17:52:38.809886 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.809861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" event={"ID":"f1de2051-f9cd-44f4-b735-5b15499925c2","Type":"ContainerStarted","Data":"73b7dc3e2dcfa50a5c20ec374d4ebef635bfb6a29054164caf4525000e89783d"}
Apr 23 17:52:38.814644 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.814615 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal" event={"ID":"5920d41245667f30b303f7ab0e3ca5d3","Type":"ContainerStarted","Data":"c7ce09b2dc9a00aba529b41519bb0f77448f3cb2ba968a4fba4d243c0d9c00e7"}
Apr 23 17:52:38.816887 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.816862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jbbn4" event={"ID":"b6c26cb5-e282-4840-a6ae-c60523f49733","Type":"ContainerStarted","Data":"16c6924215089ac3235de2cc966a174ba89132bd9eb7ba6e3fcc8d0bf3c819ae"}
Apr 23 17:52:38.822970 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:38.822946 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kqfft" event={"ID":"12d2f5b6-b903-4c15-857e-f255d7d678fd","Type":"ContainerStarted","Data":"c57999ad9465a39cb9c5f7d1339845a3fd4e6e1cc37f280ad75f03d59dbb5b81"}
Apr 23 17:52:39.827260 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:39.827218 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc16a910ec1da67194fc461e51861565" containerID="0a7489c2c45c2e63d91ca243439663a523bbd9f723f325499d07344e3d169e0f" exitCode=0
Apr 23 17:52:39.828162 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:39.828133 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal" event={"ID":"fc16a910ec1da67194fc461e51861565","Type":"ContainerDied","Data":"0a7489c2c45c2e63d91ca243439663a523bbd9f723f325499d07344e3d169e0f"}
Apr 23 17:52:39.841226 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:39.841177 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-202.ec2.internal" podStartSLOduration=2.841163292 podStartE2EDuration="2.841163292s" podCreationTimestamp="2026-04-23 17:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:52:38.828611918 +0000 UTC m=+3.643977566" watchObservedRunningTime="2026-04-23 17:52:39.841163292 +0000 UTC m=+4.656528937"
Apr 23 17:52:42.832512 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.832466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jbbn4" event={"ID":"b6c26cb5-e282-4840-a6ae-c60523f49733","Type":"ContainerStarted","Data":"937a992ef649073dc2a8d87f2163ecbd7657420a5a6956b3bb899d047590330a"}
Apr 23 17:52:42.833990 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.833962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kqfft" event={"ID":"12d2f5b6-b903-4c15-857e-f255d7d678fd","Type":"ContainerStarted","Data":"353842b9080db8b181f41aa0e0875d704f03a1c9fcfa22c48e713b05f9c75364"}
Apr 23 17:52:42.835285 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.835259 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-84tsb" event={"ID":"a1cf0add-6b75-4312-8f16-a023261bbef3","Type":"ContainerStarted","Data":"4f16cd10b5a885d7e3d0f0b33a4d68f38a9c2ac173dc992da7dc80fe9cdae6ab"}
Apr 23 17:52:42.836601 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.836579 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" event={"ID":"f1de2051-f9cd-44f4-b735-5b15499925c2","Type":"ContainerStarted","Data":"acd765c4698ce38531e432ad59461ec7f3a952e3a0b0e89020952cdec86a8dbf"}
Apr 23 17:52:42.838254 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.838228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal" event={"ID":"fc16a910ec1da67194fc461e51861565","Type":"ContainerStarted","Data":"610f707ec1193f98af46ca64796e87dbba26373d0944fe1cdda2af565b81d93c"}
Apr 23 17:52:42.846167 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.846127 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jbbn4" podStartSLOduration=3.957718222 podStartE2EDuration="7.846113901s" podCreationTimestamp="2026-04-23 17:52:35 +0000 UTC" firstStartedPulling="2026-04-23 17:52:38.265770612 +0000 UTC m=+3.081136237" lastFinishedPulling="2026-04-23 17:52:42.154166288 +0000 UTC m=+6.969531916" observedRunningTime="2026-04-23 17:52:42.845696442 +0000 UTC m=+7.661062089" watchObservedRunningTime="2026-04-23 17:52:42.846113901 +0000 UTC m=+7.661479549"
Apr 23 17:52:42.859480 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.859440 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kqfft" podStartSLOduration=3.908433424 podStartE2EDuration="7.859429962s" podCreationTimestamp="2026-04-23 17:52:35 +0000 UTC" firstStartedPulling="2026-04-23 17:52:38.264891049 +0000 UTC m=+3.080256675" lastFinishedPulling="2026-04-23 17:52:42.21588757 +0000 UTC m=+7.031253213" observedRunningTime="2026-04-23 17:52:42.859256389 +0000 UTC m=+7.674622038" watchObservedRunningTime="2026-04-23 17:52:42.859429962 +0000 UTC m=+7.674795609"
Apr 23 17:52:42.877239 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.877183 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-202.ec2.internal" podStartSLOduration=5.877140067 podStartE2EDuration="5.877140067s" podCreationTimestamp="2026-04-23 17:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:52:42.876966135 +0000 UTC m=+7.692331785" watchObservedRunningTime="2026-04-23 17:52:42.877140067 +0000 UTC m=+7.692505718"
Apr 23 17:52:42.891818 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:42.891781 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-84tsb" podStartSLOduration=3.999947493 podStartE2EDuration="7.891765427s" podCreationTimestamp="2026-04-23 17:52:35 +0000 UTC" firstStartedPulling="2026-04-23 17:52:38.264872057 +0000 UTC m=+3.080237689" lastFinishedPulling="2026-04-23 17:52:42.156689998 +0000 UTC m=+6.972055623" observedRunningTime="2026-04-23 17:52:42.891726552 +0000 UTC m=+7.707092212" watchObservedRunningTime="2026-04-23 17:52:42.891765427 +0000 UTC m=+7.707131077"
Apr 23 17:52:43.222806 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:43.222778 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 17:52:43.731117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:43.731018 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:52:43.222804786Z","UUID":"e21efd3b-0746-47a0-ac74-995018cf09d0","Handler":null,"Name":"","Endpoint":""}
Apr 23 17:52:43.734248 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:43.734218 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 17:52:43.734248 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:43.734251 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 17:52:43.841658 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:43.841628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" event={"ID":"f1de2051-f9cd-44f4-b735-5b15499925c2","Type":"ContainerStarted","Data":"4a059c7a169d51bc0d065e0ce5ef44e60d63b1479d8924c9f035b6b6986edd9f"}
Apr 23 17:52:44.844834 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:44.844780 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" event={"ID":"f1de2051-f9cd-44f4-b735-5b15499925c2","Type":"ContainerStarted","Data":"f8825940cb2c69b35ed842b03f7d43aef922d04c19d92aa9d053a01fe1764626"}
Apr 23 17:52:46.959027 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:46.958996 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:52:46.959675 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:46.959658 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:52:46.973446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:52:46.973403 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vps6l" podStartSLOduration=6.34143246 podStartE2EDuration="11.973390468s" podCreationTimestamp="2026-04-23 17:52:35 +0000 UTC" firstStartedPulling="2026-04-23 17:52:38.263037199 +0000 UTC m=+3.078402829" lastFinishedPulling="2026-04-23 17:52:43.894995211 +0000 UTC m=+8.710360837" observedRunningTime="2026-04-23 17:52:44.860529017 +0000 UTC m=+9.675894665" watchObservedRunningTime="2026-04-23 17:52:46.973390468 +0000 UTC m=+11.788756119"
Apr 23 17:53:01.294430 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:53:01.294395 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:53:01.294842 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:53:01.294532 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 17:53:01.294973 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:53:01.294957 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-84tsb"
Apr 23 17:54:35.669875 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:54:35.669691 2579 kubelet_node_status.go:509] "Node not becoming ready in time after startup"
Apr 23 17:54:35.745465 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:54:35.745433 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:40.746708 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:54:40.746663 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:45.747512 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:54:45.747474 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:50.748137 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:54:50.748096 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:54:55.748569 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:54:55.748530 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:00.749574 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:00.749537 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:02.817848 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.817792 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2zdbf"]
Apr 23 17:55:02.820293 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.820276 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.826970 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.826944 2579 status_manager.go:895] "Failed to get status for pod" podUID="ed24d47a-1604-4017-b22f-4d68978cdb27" pod="openshift-multus/multus-2zdbf" err="pods \"multus-2zdbf\" is forbidden: User \"system:node:ip-10-0-130-202.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'ip-10-0-130-202.ec2.internal' and this object"
Apr 23 17:55:02.827063 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:02.826966 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-130-202.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'ip-10-0-130-202.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap"
Apr 23 17:55:02.827277 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:02.827255 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-130-202.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'ip-10-0-130-202.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
Apr 23 17:55:02.827560 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:02.827541 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-dockercfg-4kn7n\" is forbidden: User \"system:node:ip-10-0-130-202.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'ip-10-0-130-202.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-multus\"/\"default-dockercfg-4kn7n\"" type="*v1.Secret"
Apr 23 17:55:02.827934 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:02.827914 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:ip-10-0-130-202.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'ip-10-0-130-202.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" type="*v1.ConfigMap"
Apr 23 17:55:02.828006 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:02.827948 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:ip-10-0-130-202.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'ip-10-0-130-202.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" type="*v1.ConfigMap"
Apr 23 17:55:02.914019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.913985 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-cni-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-socket-dir-parent\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-hostroot\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-multus-certs\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-cni-multus\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-conf-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-etc-kubernetes\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-k8s-cni-cncf-io\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-daemon-config\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-system-cni-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914249 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-cnibin\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-cni-binary-copy\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-os-release\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-netns\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-cni-bin\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") "
pod="openshift-multus/multus-2zdbf" Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914343 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-kubelet\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:02.914420 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:02.914356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ftzr\" (UniqueName: \"kubernetes.io/projected/ed24d47a-1604-4017-b22f-4d68978cdb27-kube-api-access-8ftzr\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015310 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-system-cni-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-cnibin\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-cni-binary-copy\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " 
pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-os-release\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-netns\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-cni-bin\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-cnibin\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-system-cni-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 
17:55:03.015456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-os-release\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-netns\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015511 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-cni-bin\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-kubelet\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ftzr\" (UniqueName: \"kubernetes.io/projected/ed24d47a-1604-4017-b22f-4d68978cdb27-kube-api-access-8ftzr\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015567 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-cni-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-kubelet\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-socket-dir-parent\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-hostroot\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.015629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-multus-certs\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-cni-multus\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-conf-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015666 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-hostroot\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-etc-kubernetes\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-multus-certs\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-k8s-cni-cncf-io\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-conf-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-var-lib-cni-multus\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015731 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-etc-kubernetes\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015739 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-daemon-config\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-socket-dir-parent\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-host-run-k8s-cni-cncf-io\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.016092 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.015698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-cni-dir\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.024665 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.023462 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-p4bt4"] Apr 23 17:55:03.027064 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.027046 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.030246 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.030227 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sv4gq\"" Apr 23 17:55:03.030363 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.030301 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 17:55:03.030465 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.030450 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 17:55:03.116639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116597 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-os-release\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116641 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pk26\" (UniqueName: \"kubernetes.io/projected/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-kube-api-access-9pk26\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-system-cni-dir\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cni-binary-copy\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.116705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cnibin\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.116858 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:55:03.116719 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217086 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cni-binary-copy\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cnibin\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cnibin\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217182 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: 
\"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217421 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-os-release\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217421 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217421 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217421 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217394 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9pk26\" (UniqueName: \"kubernetes.io/projected/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-kube-api-access-9pk26\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217571 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-system-cni-dir\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217571 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-os-release\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217571 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-system-cni-dir\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.217713 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.217695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " 
pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.218196 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.218179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.733630 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.733602 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 17:55:03.737744 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.737724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-cni-binary-copy\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf" Apr 23 17:55:03.738470 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.738454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-cni-binary-copy\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4" Apr 23 17:55:03.800925 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.800898 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gnxs9"] Apr 23 17:55:03.803861 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.803842 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:03.803961 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:03.803926 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:03.867208 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.867178 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 17:55:03.921329 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.921300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjt2x\" (UniqueName: \"kubernetes.io/projected/67c4e3ae-cc88-433d-8549-c77153e2e1d6-kube-api-access-tjt2x\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:03.921458 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.921335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:03.924077 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.924059 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 17:55:03.929878 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.929855 2579 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-9pk26\" (UniqueName: \"kubernetes.io/projected/fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b-kube-api-access-9pk26\") pod \"multus-additional-cni-plugins-p4bt4\" (UID: \"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b\") " pod="openshift-multus/multus-additional-cni-plugins-p4bt4"
Apr 23 17:55:03.929878 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.929876 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ftzr\" (UniqueName: \"kubernetes.io/projected/ed24d47a-1604-4017-b22f-4d68978cdb27-kube-api-access-8ftzr\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:03.934760 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:03.934736 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p4bt4"
Apr 23 17:55:03.942324 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:03.942304 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6e02b3_9721_4d2d_aa26_4ed9f4f1607b.slice/crio-55bb12d46cea91024ddea6ca5ef512bbcf451e0dd0a66e792692583a55a7d5e8 WatchSource:0}: Error finding container 55bb12d46cea91024ddea6ca5ef512bbcf451e0dd0a66e792692583a55a7d5e8: Status 404 returned error can't find the container with id 55bb12d46cea91024ddea6ca5ef512bbcf451e0dd0a66e792692583a55a7d5e8
Apr 23 17:55:04.015868 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:04.015843 2579 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition
Apr 23 17:55:04.016012 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:04.015938 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-daemon-config podName:ed24d47a-1604-4017-b22f-4d68978cdb27 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:04.51590784 +0000 UTC m=+149.331273482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-daemon-config") pod "multus-2zdbf" (UID: "ed24d47a-1604-4017-b22f-4d68978cdb27") : failed to sync configmap cache: timed out waiting for the condition
Apr 23 17:55:04.021723 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.021706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:04.021792 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.021742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjt2x\" (UniqueName: \"kubernetes.io/projected/67c4e3ae-cc88-433d-8549-c77153e2e1d6-kube-api-access-tjt2x\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:04.021921 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:04.021901 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:04.022004 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:04.021991 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:04.521972074 +0000 UTC m=+149.337337704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:04.031299 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.031273 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjt2x\" (UniqueName: \"kubernetes.io/projected/67c4e3ae-cc88-433d-8549-c77153e2e1d6-kube-api-access-tjt2x\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:04.036852 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.036803 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerStarted","Data":"55bb12d46cea91024ddea6ca5ef512bbcf451e0dd0a66e792692583a55a7d5e8"}
Apr 23 17:55:04.066121 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.066101 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4kn7n\""
Apr 23 17:55:04.144317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.144288 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:55:04.525317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.525284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-daemon-config\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:04.525317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.525323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:04.525547 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:04.525437 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:04.525547 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:04.525506 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:05.525490763 +0000 UTC m=+150.340856389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:04.525906 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.525887 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed24d47a-1604-4017-b22f-4d68978cdb27-multus-daemon-config\") pod \"multus-2zdbf\" (UID: \"ed24d47a-1604-4017-b22f-4d68978cdb27\") " pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:04.628650 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:04.628613 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-2zdbf"
Apr 23 17:55:04.673027 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:04.672995 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded24d47a_1604_4017_b22f_4d68978cdb27.slice/crio-21af161f5999349fa4180a034b30bb36586583371c718725499e07bcbab2e521 WatchSource:0}: Error finding container 21af161f5999349fa4180a034b30bb36586583371c718725499e07bcbab2e521: Status 404 returned error can't find the container with id 21af161f5999349fa4180a034b30bb36586583371c718725499e07bcbab2e521
Apr 23 17:55:05.039917 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:05.039879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2zdbf" event={"ID":"ed24d47a-1604-4017-b22f-4d68978cdb27","Type":"ContainerStarted","Data":"21af161f5999349fa4180a034b30bb36586583371c718725499e07bcbab2e521"}
Apr 23 17:55:05.530436 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:05.530396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:05.530620 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:05.530572 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:05.530687 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:05.530650 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:07.530629779 +0000 UTC m=+152.345995420 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:05.751022 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:05.750988 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:05.799746 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:05.799284 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:05.799746 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:05.799415 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:06.043457 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:06.043375 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b" containerID="bd9203b04f778f7807a45aca90abb2c9080e9f6fc8682336d681d9f1b4df1be5" exitCode=0
Apr 23 17:55:06.043457 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:06.043435 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerDied","Data":"bd9203b04f778f7807a45aca90abb2c9080e9f6fc8682336d681d9f1b4df1be5"}
Apr 23 17:55:07.542853 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:07.542742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:07.543309 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:07.542907 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:07.543309 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:07.542995 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:11.542972963 +0000 UTC m=+156.358338601 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:07.799330 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:07.799110 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:07.799330 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:07.799249 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:09.799041 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:09.798973 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:09.799577 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:09.799117 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:10.752607 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:10.752555 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:11.569752 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:11.569471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:11.569752 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:11.569722 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:11.570306 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:11.569818 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:19.569800346 +0000 UTC m=+164.385165985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:55:11.799243 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:11.799211 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:11.799414 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:11.799356 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:13.623976 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.623947 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ftvrg"]
Apr 23 17:55:13.627011 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.626992 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.631287 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.631265 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 17:55:13.631812 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.631777 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 17:55:13.632782 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.632757 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 17:55:13.633478 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.633462 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 17:55:13.634214 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.634188 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 17:55:13.634326 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.634195 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kznb8\""
Apr 23 17:55:13.634531 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.634513 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 17:55:13.780636 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-ovn\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-slash\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780706 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-log-socket\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780728 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-netd\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780996 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-systemd\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780996 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-etc-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.780996 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780870 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-env-overrides\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781145 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.780987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-kubelet\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781145 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-node-log\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781145 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781099 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-bin\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781145 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-var-lib-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovn-node-metrics-cert\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781187 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h2kb\" (UniqueName: \"kubernetes.io/projected/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-kube-api-access-2h2kb\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-config\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-systemd-units\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781580 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-script-lib\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.781580 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.781336 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-netns\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.798623 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.798597 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:13.798776 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:13.798750 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:13.882445 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882445 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-script-lib\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882445 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-netns\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882445 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-ovn\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-slash\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882479 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-netns\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-log-socket\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-ovn\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-log-socket\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882565 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-netd\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882563 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-slash\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName:
\"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-systemd\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-systemd\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882655 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-netd\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-etc-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.882736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-env-overrides\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-kubelet\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-etc-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-node-log\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-node-log\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-kubelet\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-bin\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-var-lib-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovn-node-metrics-cert\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.882984 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-bin\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883010 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h2kb\" (UniqueName: \"kubernetes.io/projected/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-kube-api-access-2h2kb\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-script-lib\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-var-lib-openvswitch\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-config\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-systemd-units\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883147 2579 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.883313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-env-overrides\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.883908 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883317 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-systemd-units\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.883908 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.883521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-config\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.886528 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.886505 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovn-node-metrics-cert\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.892862 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.892807 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2h2kb\" (UniqueName: \"kubernetes.io/projected/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-kube-api-access-2h2kb\") pod \"ovnkube-node-ftvrg\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.938524 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:13.938490 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:13.980684 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:13.980648 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad88c127_ee8d_4859_a2a7_6bfbe501c33f.slice/crio-85f67c62c9f5b7199ee35dfb47e4defeaf8c2469141dc33beb5b282537727af5 WatchSource:0}: Error finding container 85f67c62c9f5b7199ee35dfb47e4defeaf8c2469141dc33beb5b282537727af5: Status 404 returned error can't find the container with id 85f67c62c9f5b7199ee35dfb47e4defeaf8c2469141dc33beb5b282537727af5 Apr 23 17:55:14.060093 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:14.059970 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b" containerID="3aef3bebecfa8e1abe19aabfcaf9cd9acf56643e8c5ab5042dd6cac5a4a9bcb7" exitCode=0 Apr 23 17:55:14.060093 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:14.060014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerDied","Data":"3aef3bebecfa8e1abe19aabfcaf9cd9acf56643e8c5ab5042dd6cac5a4a9bcb7"} Apr 23 17:55:14.061358 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:14.061326 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2zdbf" 
event={"ID":"ed24d47a-1604-4017-b22f-4d68978cdb27","Type":"ContainerStarted","Data":"3b7ac5a2e063336df9d2eefdba5201419f322557a228ed55b888795f3311c042"} Apr 23 17:55:14.062364 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:14.062344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"85f67c62c9f5b7199ee35dfb47e4defeaf8c2469141dc33beb5b282537727af5"} Apr 23 17:55:15.066949 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:15.066890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerStarted","Data":"99c488b89f243ecabc0c5ac0bb98d6b31cf59ebd991ecb3a1b8d4164859ab68f"} Apr 23 17:55:15.091514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:15.091458 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2zdbf" podStartSLOduration=4.010749543 podStartE2EDuration="13.091441562s" podCreationTimestamp="2026-04-23 17:55:02 +0000 UTC" firstStartedPulling="2026-04-23 17:55:04.675037995 +0000 UTC m=+149.490403627" lastFinishedPulling="2026-04-23 17:55:13.755730001 +0000 UTC m=+158.571095646" observedRunningTime="2026-04-23 17:55:14.097039356 +0000 UTC m=+158.912405004" watchObservedRunningTime="2026-04-23 17:55:15.091441562 +0000 UTC m=+159.906807212" Apr 23 17:55:15.753244 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:15.753201 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:15.799398 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:15.799358 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:15.799567 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:15.799489 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:16.070857 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:16.070758 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b" containerID="99c488b89f243ecabc0c5ac0bb98d6b31cf59ebd991ecb3a1b8d4164859ab68f" exitCode=0 Apr 23 17:55:16.070857 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:16.070814 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerDied","Data":"99c488b89f243ecabc0c5ac0bb98d6b31cf59ebd991ecb3a1b8d4164859ab68f"} Apr 23 17:55:16.583376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:16.583337 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-qxm9s"] Apr 23 17:55:16.586246 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:16.586221 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:16.586386 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:16.586303 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:16.706078 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:16.706046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:16.807372 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:16.807334 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:16.814530 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:16.814503 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:16.814530 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:16.814535 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:16.814714 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:16.814550 2579 projected.go:194] Error preparing data for projected volume kube-api-access-sgh75 for pod openshift-network-diagnostics/network-check-target-qxm9s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:16.814714 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:16.814626 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75 podName:76db35ef-2b60-42ed-bb55-38ec1534f90f nodeName:}" failed. No retries permitted until 2026-04-23 17:55:17.314605229 +0000 UTC m=+162.129970859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sgh75" (UniqueName: "kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75") pod "network-check-target-qxm9s" (UID: "76db35ef-2b60-42ed-bb55-38ec1534f90f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:17.075507 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:17.075462 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b" containerID="64d6567710a26753f058bbdd3a929d0c8f5b82b31cb2cded513eff8b17201807" exitCode=0 Apr 23 17:55:17.075507 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:17.075508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerDied","Data":"64d6567710a26753f058bbdd3a929d0c8f5b82b31cb2cded513eff8b17201807"} Apr 23 17:55:17.413129 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:17.413028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:17.413306 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:17.413191 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:17.413306 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:17.413213 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:17.413306 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:17.413223 2579 projected.go:194] Error preparing data for projected volume kube-api-access-sgh75 for pod openshift-network-diagnostics/network-check-target-qxm9s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:17.413306 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:17.413294 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75 podName:76db35ef-2b60-42ed-bb55-38ec1534f90f nodeName:}" failed. No retries permitted until 2026-04-23 17:55:18.413278155 +0000 UTC m=+163.228643781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgh75" (UniqueName: "kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75") pod "network-check-target-qxm9s" (UID: "76db35ef-2b60-42ed-bb55-38ec1534f90f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:17.798644 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:17.798585 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:17.798644 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:17.798585 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:17.799018 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:17.798979 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:17.799279 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:17.799257 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:18.418666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:18.418617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:18.419170 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:18.418805 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:18.419170 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:18.418854 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Apr 23 17:55:18.419170 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:18.418870 2579 projected.go:194] Error preparing data for projected volume kube-api-access-sgh75 for pod openshift-network-diagnostics/network-check-target-qxm9s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:18.419170 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:18.418936 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75 podName:76db35ef-2b60-42ed-bb55-38ec1534f90f nodeName:}" failed. No retries permitted until 2026-04-23 17:55:20.418916059 +0000 UTC m=+165.234281699 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgh75" (UniqueName: "kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75") pod "network-check-target-qxm9s" (UID: "76db35ef-2b60-42ed-bb55-38ec1534f90f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:19.626891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:19.626848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:19.627469 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:19.626999 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:19.627469 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:19.627073 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:35.627051357 +0000 UTC m=+180.442416988 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:19.798459 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:19.798346 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:19.798459 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:19.798353 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:19.798459 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:19.798452 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:19.798750 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:19.798577 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:19.999430 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:19.999344 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-nm928"] Apr 23 17:55:20.001255 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.001230 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.004326 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.004161 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d25k5\"" Apr 23 17:55:20.005904 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.005882 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 17:55:20.006036 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.005916 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 17:55:20.006036 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.005916 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 17:55:20.130415 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.130328 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-iptables-alerter-script\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.130415 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.130378 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7xh2\" (UniqueName: \"kubernetes.io/projected/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-kube-api-access-n7xh2\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.130657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.130511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-host-slash\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.231808 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.231769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-host-slash\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.232013 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.231883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-iptables-alerter-script\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.232013 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.231905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-host-slash\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 
17:55:20.232013 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.231914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7xh2\" (UniqueName: \"kubernetes.io/projected/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-kube-api-access-n7xh2\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.232514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.232490 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-iptables-alerter-script\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.250235 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.250207 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7xh2\" (UniqueName: \"kubernetes.io/projected/f8d9bbd2-cf59-434c-a2f6-2abd36d2113b-kube-api-access-n7xh2\") pod \"iptables-alerter-nm928\" (UID: \"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b\") " pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.314187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.314090 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-nm928" Apr 23 17:55:20.322572 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:20.322533 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8d9bbd2_cf59_434c_a2f6_2abd36d2113b.slice/crio-94b2e6817046b303a16798214f73ca4c663e63886a5ba87d0c17f09e65bc2451 WatchSource:0}: Error finding container 94b2e6817046b303a16798214f73ca4c663e63886a5ba87d0c17f09e65bc2451: Status 404 returned error can't find the container with id 94b2e6817046b303a16798214f73ca4c663e63886a5ba87d0c17f09e65bc2451 Apr 23 17:55:20.433195 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:20.433122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:20.433372 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:20.433302 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:20.433372 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:20.433325 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:20.433372 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:20.433339 2579 projected.go:194] Error preparing data for projected volume kube-api-access-sgh75 for pod openshift-network-diagnostics/network-check-target-qxm9s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 
17:55:20.433532 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:20.433414 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75 podName:76db35ef-2b60-42ed-bb55-38ec1534f90f nodeName:}" failed. No retries permitted until 2026-04-23 17:55:24.433393624 +0000 UTC m=+169.248759312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgh75" (UniqueName: "kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75") pod "network-check-target-qxm9s" (UID: "76db35ef-2b60-42ed-bb55-38ec1534f90f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:20.754587 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:20.754552 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:21.084137 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:21.084050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nm928" event={"ID":"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b","Type":"ContainerStarted","Data":"94b2e6817046b303a16798214f73ca4c663e63886a5ba87d0c17f09e65bc2451"} Apr 23 17:55:21.798681 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:21.798640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:21.798681 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:21.798665 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:21.799166 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:21.798790 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:21.799166 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:21.798943 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:23.799010 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:23.798972 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:23.799450 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:23.799109 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:23.799450 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:23.799146 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:23.799450 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:23.799286 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:24.461243 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:24.461195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:24.461428 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:24.461412 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:24.461503 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:24.461436 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:24.461503 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:24.461446 2579 projected.go:194] Error preparing data for projected volume kube-api-access-sgh75 for pod openshift-network-diagnostics/network-check-target-qxm9s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:24.461604 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:24.461513 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75 podName:76db35ef-2b60-42ed-bb55-38ec1534f90f nodeName:}" failed. No retries permitted until 2026-04-23 17:55:32.46149496 +0000 UTC m=+177.276860603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgh75" (UniqueName: "kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75") pod "network-check-target-qxm9s" (UID: "76db35ef-2b60-42ed-bb55-38ec1534f90f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:25.755646 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:25.755602 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:25.800028 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:25.799993 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:25.800203 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:25.800175 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:25.800300 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:25.800225 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:25.800395 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:25.800327 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:27.798748 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:27.798709 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:27.798748 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:27.798707 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:27.799277 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:27.798981 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:27.799277 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:27.799098 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:29.100083 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.099808 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b" containerID="7db9a509320e3ae3b7931bdbe00c32d7009887f2626318de3380137adfb069d6" exitCode=0 Apr 23 17:55:29.100769 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.099861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerDied","Data":"7db9a509320e3ae3b7931bdbe00c32d7009887f2626318de3380137adfb069d6"} Apr 23 17:55:29.102794 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.102767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796"} Apr 23 17:55:29.102919 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.102799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b"} Apr 23 17:55:29.102919 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.102808 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43"} Apr 23 17:55:29.102919 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.102817 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" 
event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27"} Apr 23 17:55:29.102919 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.102847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f"} Apr 23 17:55:29.102919 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.102856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b"} Apr 23 17:55:29.798716 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.798682 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:29.798891 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:29.798800 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:29.798891 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:29.798869 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:29.798998 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:29.798978 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:30.106101 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:30.106068 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nm928" event={"ID":"f8d9bbd2-cf59-434c-a2f6-2abd36d2113b","Type":"ContainerStarted","Data":"d4827c66dbb3e23c6bc542dec0f395ed3d4486680ef42bb7cdb7f775767deec5"} Apr 23 17:55:30.108603 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:30.108578 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b" containerID="a1a5235bc879c65c0c34a47416ca98464c0884b7dde99ce5e14ad5c3015d4788" exitCode=0 Apr 23 17:55:30.108716 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:30.108612 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerDied","Data":"a1a5235bc879c65c0c34a47416ca98464c0884b7dde99ce5e14ad5c3015d4788"} Apr 23 17:55:30.176529 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:30.176410 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nm928" podStartSLOduration=2.953558473 podStartE2EDuration="11.176392327s" podCreationTimestamp="2026-04-23 17:55:19 +0000 UTC" firstStartedPulling="2026-04-23 17:55:20.324673042 +0000 UTC m=+165.140038682" lastFinishedPulling="2026-04-23 
17:55:28.547506911 +0000 UTC m=+173.362872536" observedRunningTime="2026-04-23 17:55:30.130150262 +0000 UTC m=+174.945515911" watchObservedRunningTime="2026-04-23 17:55:30.176392327 +0000 UTC m=+174.991757975" Apr 23 17:55:30.756610 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:30.756582 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:31.113390 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:31.113347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" event={"ID":"fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b","Type":"ContainerStarted","Data":"ecf3bf2454a3ff528b32ce4ecd6156e8279038bcfa0106dcbee415fe2b7fb8fd"} Apr 23 17:55:31.116300 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:31.116273 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e"} Apr 23 17:55:31.798703 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:31.798663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:31.798894 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:31.798810 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:31.798894 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:31.798862 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:31.799016 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:31.798993 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:32.512572 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:32.512530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:32.512972 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:32.512709 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:55:32.512972 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:32.512731 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:55:32.512972 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:32.512744 2579 projected.go:194] Error preparing data for projected volume kube-api-access-sgh75 for pod openshift-network-diagnostics/network-check-target-qxm9s: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:32.512972 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:32.512798 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75 podName:76db35ef-2b60-42ed-bb55-38ec1534f90f nodeName:}" failed. No retries permitted until 2026-04-23 17:55:48.512784653 +0000 UTC m=+193.328150278 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sgh75" (UniqueName: "kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75") pod "network-check-target-qxm9s" (UID: "76db35ef-2b60-42ed-bb55-38ec1534f90f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:55:33.799020 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:33.798983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:33.799557 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:33.798983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:33.799557 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:33.799117 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:33.799557 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:33.799181 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:34.125685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:34.125601 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerStarted","Data":"9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c"} Apr 23 17:55:34.154300 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:34.154254 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p4bt4" podStartSLOduration=6.906122073 podStartE2EDuration="31.154241634s" podCreationTimestamp="2026-04-23 17:55:03 +0000 UTC" firstStartedPulling="2026-04-23 17:55:03.943491209 +0000 UTC m=+148.758856835" lastFinishedPulling="2026-04-23 17:55:28.19161077 +0000 UTC m=+173.006976396" observedRunningTime="2026-04-23 17:55:31.138095594 +0000 UTC m=+175.953461242" watchObservedRunningTime="2026-04-23 17:55:34.154241634 +0000 UTC m=+178.969607281" Apr 23 17:55:34.154452 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:34.154384 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podStartSLOduration=6.944943944 podStartE2EDuration="21.154380056s" podCreationTimestamp="2026-04-23 17:55:13 +0000 UTC" firstStartedPulling="2026-04-23 17:55:13.982337836 +0000 UTC m=+158.797703463" 
lastFinishedPulling="2026-04-23 17:55:28.191773947 +0000 UTC m=+173.007139575" observedRunningTime="2026-04-23 17:55:34.152663856 +0000 UTC m=+178.968029503" watchObservedRunningTime="2026-04-23 17:55:34.154380056 +0000 UTC m=+178.969745720" Apr 23 17:55:35.052894 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.052865 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qxm9s"] Apr 23 17:55:35.053260 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.052984 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:35.053260 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:35.053085 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:35.055425 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.055400 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gnxs9"] Apr 23 17:55:35.055529 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.055495 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:35.055582 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:35.055568 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:35.127893 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.127766 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:35.127893 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.127804 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:35.127893 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.127863 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:35.143215 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.143191 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:35.143382 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.143368 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:35.632078 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:35.632031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:35.632267 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:35.632156 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:35.632267 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:35.632210 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs 
podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:07.632196916 +0000 UTC m=+212.447562543 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:55:35.757557 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:35.757522 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Apr 23 17:55:36.798870 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:36.798813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:36.799300 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:36.798813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:36.799300 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:36.798965 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:36.799300 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:36.799037 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6" Apr 23 17:55:38.477694 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.477655 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ftvrg"] Apr 23 17:55:38.478317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.478151 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-controller" containerID="cri-o://9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b" gracePeriod=30 Apr 23 17:55:38.478317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.478235 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-node" containerID="cri-o://9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27" gracePeriod=30 Apr 23 17:55:38.478317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.478262 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="northd" containerID="cri-o://16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b" gracePeriod=30 Apr 23 17:55:38.478317 ip-10-0-130-202 kubenswrapper[2579]: I0423 
17:55:38.478209 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="nbdb" containerID="cri-o://0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796" gracePeriod=30 Apr 23 17:55:38.478317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.478290 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-acl-logging" containerID="cri-o://2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f" gracePeriod=30 Apr 23 17:55:38.478317 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.478233 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43" gracePeriod=30 Apr 23 17:55:38.478589 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.478296 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="sbdb" containerID="cri-o://824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e" gracePeriod=30 Apr 23 17:55:38.491319 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.491260 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovnkube-controller" containerID="cri-o://9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c" gracePeriod=30 Apr 23 17:55:38.494074 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.494022 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovnkube-controller" probeResult="failure" output="" Apr 23 17:55:38.803164 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.803073 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:38.803164 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:38.803092 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9" Apr 23 17:55:38.803412 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:38.803217 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f" Apr 23 17:55:38.803542 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:38.803517 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:39.137534 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.137462 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:55:39.137883 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.137869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/kube-rbac-proxy-node/0.log"
Apr 23 17:55:39.138260 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138241 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovn-acl-logging/0.log"
Apr 23 17:55:39.138635 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138623 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovn-controller/0.log"
Apr 23 17:55:39.138702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138656 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e" exitCode=0
Apr 23 17:55:39.138702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138672 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796" exitCode=0
Apr 23 17:55:39.138702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138684 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b" exitCode=0
Apr 23 17:55:39.138702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138692 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43" exitCode=143
Apr 23 17:55:39.138702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138697 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27" exitCode=143
Apr 23 17:55:39.138702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138702 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f" exitCode=143
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138707 2579 generic.go:358] "Generic (PLEG): container finished" podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b" exitCode=143
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138727 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e"}
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138755 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796"}
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b"}
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43"}
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27"}
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138792 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f"}
Apr 23 17:55:39.138952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.138801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b"}
Apr 23 17:55:39.945513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.945491 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovnkube-controller/0.log"
Apr 23 17:55:39.946697 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.946679 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:55:39.947039 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.947025 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/kube-rbac-proxy-node/0.log"
Apr 23 17:55:39.947333 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.947322 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovn-acl-logging/0.log"
Apr 23 17:55:39.947706 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.947694 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovn-controller/0.log"
Apr 23 17:55:39.947806 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.947795 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg"
Apr 23 17:55:39.961226 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961211 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-netd\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961275 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961243 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-systemd\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961275 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961250 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961295 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-bin\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961312 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961333 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961348 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961394 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-netns\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961423 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovn-node-metrics-cert\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961444 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-node-log\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961445 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961469 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-env-overrides\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961511 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-var-lib-openvswitch\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961506 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-node-log" (OuterVolumeSpecName: "node-log") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961527 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-ovn\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961553 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961546 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-config\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961567 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-slash\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961580 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-kubelet\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961600 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961607 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-systemd-units\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961636 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961640 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-slash" (OuterVolumeSpecName: "host-slash") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961652 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-script-lib\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961670 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961679 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961682 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-openvswitch\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961707 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961713 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-etc-openvswitch\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961742 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h2kb\" (UniqueName: \"kubernetes.io/projected/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-kube-api-access-2h2kb\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961768 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-log-socket\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961790 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-ovn-kubernetes\") pod \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\" (UID: \"ad88c127-ee8d-4859-a2a7-6bfbe501c33f\") "
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961806 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-log-socket" (OuterVolumeSpecName: "log-socket") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.961953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961852 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961891 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961914 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961944 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961955 2579 reconciler_common.go:299] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-slash\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961974 2579 reconciler_common.go:299] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-kubelet\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.961987 2579 reconciler_common.go:299] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-systemd-units\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962002 2579 reconciler_common.go:299] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-openvswitch\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962016 2579 reconciler_common.go:299] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-etc-openvswitch\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962030 2579 reconciler_common.go:299] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-log-socket\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962044 2579 reconciler_common.go:299] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-netd\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962054 2579 reconciler_common.go:299] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-cni-bin\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962063 2579 reconciler_common.go:299] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962072 2579 reconciler_common.go:299] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-netns\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962080 2579 reconciler_common.go:299] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-node-log\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962088 2579 reconciler_common.go:299] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-var-lib-openvswitch\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962028 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962100 2579 reconciler_common.go:299] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-ovn\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.962513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.962125 2579 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-config\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:55:39.964365 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.964347 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:55:39.964658 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.964639 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-kube-api-access-2h2kb" (OuterVolumeSpecName: "kube-api-access-2h2kb") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "kube-api-access-2h2kb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:55:39.965215 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:39.965200 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ad88c127-ee8d-4859-a2a7-6bfbe501c33f" (UID: "ad88c127-ee8d-4859-a2a7-6bfbe501c33f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:55:40.013705 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013676 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9sxcg"]
Apr 23 17:55:40.013839 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013814 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-controller"
Apr 23 17:55:40.013880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013846 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-controller"
Apr 23 17:55:40.013880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013856 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="northd"
Apr 23 17:55:40.013880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013862 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="northd"
Apr 23 17:55:40.013880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013868 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-node"
Apr 23 17:55:40.013880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013873 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-node"
Apr 23 17:55:40.013880 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013880 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="sbdb"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013885 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="sbdb"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013892 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-acl-logging"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013898 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-acl-logging"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013903 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-ovn-metrics"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013908 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-ovn-metrics"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013914 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="nbdb"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013918 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="nbdb"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013924 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovnkube-controller"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013929 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovnkube-controller"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013957 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovnkube-controller"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013964 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-node"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013969 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="nbdb"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013974 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="sbdb"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013978 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-acl-logging"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013983 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="kube-rbac-proxy-ovn-metrics"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013989 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="northd"
Apr 23 17:55:40.014045 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.013995 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerName="ovn-controller"
Apr 23 17:55:40.053680 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.053629 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064596 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-cni-netd\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064709 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-node-log\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064709 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064643 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovn-node-metrics-cert\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064709 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064669 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-var-lib-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064847 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-kubelet\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064847 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-systemd-units\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.064847 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-etc-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.065016 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-run-netns\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.065016 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-slash\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.065016 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064965 2579 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065016 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.064989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-run-ovn-kubernetes\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-env-overrides\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-ovn\") pod \"ovnkube-node-9sxcg\" (UID: 
\"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovnkube-script-lib\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065114 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-log-socket\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065141 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-cni-bin\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovnkube-config\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065183 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065177 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwg4v\" (UniqueName: 
\"kubernetes.io/projected/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-kube-api-access-mwg4v\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065205 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-systemd\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065252 2579 reconciler_common.go:299] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-host-run-ovn-kubernetes\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065271 2579 reconciler_common.go:299] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-run-systemd\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065285 2579 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovn-node-metrics-cert\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065301 2579 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-env-overrides\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065316 2579 
reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-ovnkube-script-lib\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:55:40.065472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.065332 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h2kb\" (UniqueName: \"kubernetes.io/projected/ad88c127-ee8d-4859-a2a7-6bfbe501c33f-kube-api-access-2h2kb\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:55:40.142501 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.142470 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovnkube-controller/0.log" Apr 23 17:55:40.143697 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.143677 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:55:40.144047 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.144034 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/kube-rbac-proxy-node/0.log" Apr 23 17:55:40.144424 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.144404 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovn-acl-logging/0.log" Apr 23 17:55:40.144914 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.144900 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ftvrg_ad88c127-ee8d-4859-a2a7-6bfbe501c33f/ovn-controller/0.log" Apr 23 17:55:40.144989 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.144934 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" containerID="9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c" exitCode=1 Apr 23 17:55:40.144989 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.144969 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c"} Apr 23 17:55:40.145089 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.144992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" event={"ID":"ad88c127-ee8d-4859-a2a7-6bfbe501c33f","Type":"ContainerDied","Data":"85f67c62c9f5b7199ee35dfb47e4defeaf8c2469141dc33beb5b282537727af5"} Apr 23 17:55:40.145089 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.145007 2579 scope.go:117] "RemoveContainer" containerID="9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c" Apr 23 17:55:40.145173 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.145131 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ftvrg" Apr 23 17:55:40.154244 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.154225 2579 scope.go:117] "RemoveContainer" containerID="824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e" Apr 23 17:55:40.160579 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.160556 2579 scope.go:117] "RemoveContainer" containerID="0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796" Apr 23 17:55:40.165858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-log-socket\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.165958 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-cni-bin\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.165958 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165917 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovnkube-config\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.165958 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwg4v\" (UniqueName: \"kubernetes.io/projected/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-kube-api-access-mwg4v\") pod \"ovnkube-node-9sxcg\" (UID: 
\"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-log-socket\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-systemd\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.165966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-cni-bin\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-cni-netd\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166051 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-cni-netd\") pod \"ovnkube-node-9sxcg\" (UID: 
\"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166058 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-node-log\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166115 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovn-node-metrics-cert\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-var-lib-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-kubelet\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-systemd-units\") pod 
\"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-etc-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-run-netns\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-slash\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-kubelet\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166301 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-openvswitch\") pod 
\"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-systemd\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166123 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-node-log\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166365 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-var-lib-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166374 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-run-netns\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.166399 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-systemd-units\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-slash\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-etc-openvswitch\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-run-ovn-kubernetes\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-host-run-ovn-kubernetes\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-env-overrides\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166523 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovnkube-config\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-ovn\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166573 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovnkube-script-lib\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166625 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-run-ovn\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.166778 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-env-overrides\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167187 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.167027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovnkube-script-lib\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.167795 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.167211 2579 scope.go:117] "RemoveContainer" 
containerID="16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b" Apr 23 17:55:40.169132 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.169113 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-ovn-node-metrics-cert\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.173363 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.173346 2579 scope.go:117] "RemoveContainer" containerID="78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43" Apr 23 17:55:40.176980 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.176954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwg4v\" (UniqueName: \"kubernetes.io/projected/0f7a35cb-32ab-4b1d-a1de-f711ce9b0045-kube-api-access-mwg4v\") pod \"ovnkube-node-9sxcg\" (UID: \"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045\") " pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" Apr 23 17:55:40.179535 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.179520 2579 scope.go:117] "RemoveContainer" containerID="9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27" Apr 23 17:55:40.185433 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.185414 2579 scope.go:117] "RemoveContainer" containerID="2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f" Apr 23 17:55:40.186269 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.186213 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ftvrg"] Apr 23 17:55:40.191077 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.191057 2579 scope.go:117] "RemoveContainer" containerID="9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b" Apr 23 17:55:40.195674 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.195654 2579 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ftvrg"]
Apr 23 17:55:40.197913 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.197894 2579 scope.go:117] "RemoveContainer" containerID="9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c"
Apr 23 17:55:40.198173 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.198155 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c\": container with ID starting with 9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c not found: ID does not exist" containerID="9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c"
Apr 23 17:55:40.198224 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198184 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c"} err="failed to get container status \"9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c\": rpc error: code = NotFound desc = could not find container \"9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c\": container with ID starting with 9952ce52a2ae49f759153b7a647a97c4a1dbe3ddf2085fa0d7b658dcf1a2e90c not found: ID does not exist"
Apr 23 17:55:40.198224 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198217 2579 scope.go:117] "RemoveContainer" containerID="824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e"
Apr 23 17:55:40.198446 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.198432 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e\": container with ID starting with 824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e not found: ID does not exist" containerID="824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e"
Apr 23 17:55:40.198490 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198449 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e"} err="failed to get container status \"824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e\": rpc error: code = NotFound desc = could not find container \"824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e\": container with ID starting with 824443c5de3cc93d89748022d1fc1ac35a30203113ab00ca59eb0945747cdd5e not found: ID does not exist"
Apr 23 17:55:40.198490 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198462 2579 scope.go:117] "RemoveContainer" containerID="0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796"
Apr 23 17:55:40.198657 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.198644 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796\": container with ID starting with 0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796 not found: ID does not exist" containerID="0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796"
Apr 23 17:55:40.198690 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198660 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796"} err="failed to get container status \"0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796\": rpc error: code = NotFound desc = could not find container \"0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796\": container with ID starting with 0ca4339de19718603c17dd43cbbff209fff2c380f2e420be4471079a544f1796 not found: ID does not exist"
Apr 23 17:55:40.198690 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198672 2579 scope.go:117] "RemoveContainer" containerID="16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b"
Apr 23 17:55:40.198868 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.198850 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b\": container with ID starting with 16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b not found: ID does not exist" containerID="16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b"
Apr 23 17:55:40.198903 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198874 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b"} err="failed to get container status \"16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b\": rpc error: code = NotFound desc = could not find container \"16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b\": container with ID starting with 16ded5a654c1c900211a6f3a04335d167ddba327ae3673de51191a9c17f5d50b not found: ID does not exist"
Apr 23 17:55:40.198903 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.198891 2579 scope.go:117] "RemoveContainer" containerID="78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43"
Apr 23 17:55:40.199106 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.199092 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43\": container with ID starting with 78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43 not found: ID does not exist" containerID="78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43"
Apr 23 17:55:40.199148 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199111 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43"} err="failed to get container status \"78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43\": rpc error: code = NotFound desc = could not find container \"78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43\": container with ID starting with 78d107c1ac76a282f8ae4a170a783fb552254356f6c69143a789f36e6bf24c43 not found: ID does not exist"
Apr 23 17:55:40.199148 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199123 2579 scope.go:117] "RemoveContainer" containerID="9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27"
Apr 23 17:55:40.199316 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.199302 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27\": container with ID starting with 9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27 not found: ID does not exist" containerID="9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27"
Apr 23 17:55:40.199360 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199319 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27"} err="failed to get container status \"9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27\": rpc error: code = NotFound desc = could not find container \"9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27\": container with ID starting with 9ba8ddc5fa594efe44629a779946aa72140e9e78f9942e718393204df0739c27 not found: ID does not exist"
Apr 23 17:55:40.199360 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199330 2579 scope.go:117] "RemoveContainer" containerID="2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f"
Apr 23 17:55:40.199496 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.199481 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f\": container with ID starting with 2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f not found: ID does not exist" containerID="2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f"
Apr 23 17:55:40.199534 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199498 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f"} err="failed to get container status \"2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f\": rpc error: code = NotFound desc = could not find container \"2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f\": container with ID starting with 2de98d858f8e78ff6aca2ca9afaf2e1df55cadf8c33070c00091e6fbc469591f not found: ID does not exist"
Apr 23 17:55:40.199534 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199509 2579 scope.go:117] "RemoveContainer" containerID="9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b"
Apr 23 17:55:40.199690 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.199676 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b\": container with ID starting with 9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b not found: ID does not exist" containerID="9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b"
Apr 23 17:55:40.199729 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.199693 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b"} err="failed to get container status \"9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b\": rpc error: code = NotFound desc = could not find container \"9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b\": container with ID starting with 9706dd91760c54ddea7abf711bf53385a2ff4273f0835d46e97b47da9fb0245b not found: ID does not exist"
Apr 23 17:55:40.361909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.361789 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:40.369308 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:40.369278 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f7a35cb_32ab_4b1d_a1de_f711ce9b0045.slice/crio-f702ada663a45e81e0102633e45c70694eded988b8dc40af73d96756d004b279 WatchSource:0}: Error finding container f702ada663a45e81e0102633e45c70694eded988b8dc40af73d96756d004b279: Status 404 returned error can't find the container with id f702ada663a45e81e0102633e45c70694eded988b8dc40af73d96756d004b279
Apr 23 17:55:40.758043 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.758015 2579 kubelet.go:3132] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Apr 23 17:55:40.798648 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.798619 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:40.798786 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:40.798619 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:40.798857 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.798818 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f"
Apr 23 17:55:40.798986 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:40.798960 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:41.150282 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"b2602c675f3bf16f2935fd1cf0d1309deb2bfd72174d70654472c46b26048cf7"}
Apr 23 17:55:41.150282 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150277 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"dc8850481ce01e0f1e4e60566be4772461a02eef51c3f5940fb4ce3071233e65"}
Apr 23 17:55:41.150282 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"0a85e43964bfaa38bb4a33e5c5024f478936d733fcca9cf3cc69803797e04085"}
Apr 23 17:55:41.150666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"022c0833a1fbcd7d68200eb8eb8fffefea653db2b0642bcd3f11ee9cf6805df8"}
Apr 23 17:55:41.150666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150304 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"dd181627c569b00496a384b1e72ec6765e2e2f9c1e4e1b5816f622696d648f6c"}
Apr 23 17:55:41.150666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"18191e88ff6e10d4023cfdede08dc9e07e09ba942603b0cc14e0283b20ec859f"}
Apr 23 17:55:41.150666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.150325 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"f702ada663a45e81e0102633e45c70694eded988b8dc40af73d96756d004b279"}
Apr 23 17:55:41.801261 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:41.801230 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad88c127-ee8d-4859-a2a7-6bfbe501c33f" path="/var/lib/kubelet/pods/ad88c127-ee8d-4859-a2a7-6bfbe501c33f/volumes"
Apr 23 17:55:42.798864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:42.798841 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:42.799100 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:42.798945 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f"
Apr 23 17:55:42.799100 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:42.798998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:42.799173 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:42.799119 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:43.156409 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:43.156322 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"6b8f6840902d0af8d3c099b6313c4f9700dfa94b5992a8b8bee61ca046a50a5e"}
Apr 23 17:55:44.799018 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:44.798980 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:44.799470 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:44.798989 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:44.799470 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:44.799096 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxm9s" podUID="76db35ef-2b60-42ed-bb55-38ec1534f90f"
Apr 23 17:55:44.799470 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:44.799155 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gnxs9" podUID="67c4e3ae-cc88-433d-8549-c77153e2e1d6"
Apr 23 17:55:45.163986 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:45.163611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" event={"ID":"0f7a35cb-32ab-4b1d-a1de-f711ce9b0045","Type":"ContainerStarted","Data":"3f01a24f0078cfe2d755c6a70b9155cf42b02bc838efda2d9e69dcb177b800ab"}
Apr 23 17:55:45.163986 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:45.163875 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:45.163986 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:45.163894 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:45.178646 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:45.178624 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:45.196758 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:45.196706 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg" podStartSLOduration=6.196690644 podStartE2EDuration="6.196690644s" podCreationTimestamp="2026-04-23 17:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:45.196397297 +0000 UTC m=+190.011762950" watchObservedRunningTime="2026-04-23 17:55:45.196690644 +0000 UTC m=+190.012056292"
Apr 23 17:55:46.166520 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.166491 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:46.180371 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.180348 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:55:46.798798 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.798757 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:46.798981 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.798857 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:55:46.801576 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.801555 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 17:55:46.801662 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.801589 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 17:55:46.802985 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.802969 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 17:55:46.802985 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.802983 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-s85dw\""
Apr 23 17:55:46.803113 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:46.803013 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-brp64\""
Apr 23 17:55:48.518046 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:48.517987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:48.520863 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:48.520816 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgh75\" (UniqueName: \"kubernetes.io/projected/76db35ef-2b60-42ed-bb55-38ec1534f90f-kube-api-access-sgh75\") pod \"network-check-target-qxm9s\" (UID: \"76db35ef-2b60-42ed-bb55-38ec1534f90f\") " pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:48.608865 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:48.608807 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:55:48.740578 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:48.740547 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qxm9s"]
Apr 23 17:55:48.744435 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:48.744265 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76db35ef_2b60_42ed_bb55_38ec1534f90f.slice/crio-226c5668e0d8894bc8b109f8e89f35c9c3590698def824ec43dd49c093920dfe WatchSource:0}: Error finding container 226c5668e0d8894bc8b109f8e89f35c9c3590698def824ec43dd49c093920dfe: Status 404 returned error can't find the container with id 226c5668e0d8894bc8b109f8e89f35c9c3590698def824ec43dd49c093920dfe
Apr 23 17:55:49.173347 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.173305 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qxm9s" event={"ID":"76db35ef-2b60-42ed-bb55-38ec1534f90f","Type":"ContainerStarted","Data":"226c5668e0d8894bc8b109f8e89f35c9c3590698def824ec43dd49c093920dfe"}
Apr 23 17:55:49.690844 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.690804 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-c9dz6"]
Apr 23 17:55:49.697401 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.697376 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.702691 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.702673 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 17:55:49.703446 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.703411 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 17:55:49.703568 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.703414 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qcv89\""
Apr 23 17:55:49.727098 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.727064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/47e47db9-35db-473d-b6e8-181ec486420c-hosts-file\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.727224 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.727117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbtx\" (UniqueName: \"kubernetes.io/projected/47e47db9-35db-473d-b6e8-181ec486420c-kube-api-access-bvbtx\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.727224 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.727185 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/47e47db9-35db-473d-b6e8-181ec486420c-tmp-dir\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.828374 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.828347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/47e47db9-35db-473d-b6e8-181ec486420c-hosts-file\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.828560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.828381 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvbtx\" (UniqueName: \"kubernetes.io/projected/47e47db9-35db-473d-b6e8-181ec486420c-kube-api-access-bvbtx\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.828560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.828419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/47e47db9-35db-473d-b6e8-181ec486420c-tmp-dir\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.828560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.828485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/47e47db9-35db-473d-b6e8-181ec486420c-hosts-file\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.828774 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.828755 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/47e47db9-35db-473d-b6e8-181ec486420c-tmp-dir\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.840731 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.840705 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbtx\" (UniqueName: \"kubernetes.io/projected/47e47db9-35db-473d-b6e8-181ec486420c-kube-api-access-bvbtx\") pod \"node-resolver-c9dz6\" (UID: \"47e47db9-35db-473d-b6e8-181ec486420c\") " pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:49.992793 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:49.992719 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-202.ec2.internal" event="NodeReady"
Apr 23 17:55:50.008627 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.008598 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c9dz6"
Apr 23 17:55:50.021941 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:50.021907 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47e47db9_35db_473d_b6e8_181ec486420c.slice/crio-74b060de970d79866a98e43d53f31d1f51ea2ab8827ffce831674c9a7637caa8 WatchSource:0}: Error finding container 74b060de970d79866a98e43d53f31d1f51ea2ab8827ffce831674c9a7637caa8: Status 404 returned error can't find the container with id 74b060de970d79866a98e43d53f31d1f51ea2ab8827ffce831674c9a7637caa8
Apr 23 17:55:50.052448 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.052421 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6867cfcb6c-chk7c"]
Apr 23 17:55:50.054111 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.054089 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.057203 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.057171 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 17:55:50.057544 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.057510 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 17:55:50.058061 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.058038 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 17:55:50.058374 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.058352 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f7p8x\""
Apr 23 17:55:50.063632 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.063232 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 17:55:50.069403 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.069345 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nq87q"]
Apr 23 17:55:50.071596 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.071450 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-hfd28"]
Apr 23 17:55:50.071596 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.071559 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nq87q"
Apr 23 17:55:50.073275 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.073239 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:55:50.077554 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.076956 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 17:55:50.077554 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.077217 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-sysctl-allowlist\""
Apr 23 17:55:50.077554 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.077410 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 17:55:50.077554 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.077485 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6867cfcb6c-chk7c"]
Apr 23 17:55:50.077785 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.077577 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4gw5f\""
Apr 23 17:55:50.082197 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.082162 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nq87q"]
Apr 23 17:55:50.094054 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.093863 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qnq92"]
Apr 23 17:55:50.095850 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.095803 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnq92"
Apr 23 17:55:50.099364 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.099310 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 17:55:50.100221 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.099666 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 17:55:50.100221 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.099965 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 17:55:50.100633 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.100615 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wh8vz\""
Apr 23 17:55:50.106483 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.106443 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnq92"]
Apr 23 17:55:50.130648 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92"
Apr 23 17:55:50.130747 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130657 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4adb4da-c47c-42d2-ba86-f2c0503ec968-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:55:50.130747 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130679 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rthwk\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-kube-api-access-rthwk\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.130747 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130698 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-registry-certificates\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.130957 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130749 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-image-registry-private-configuration\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.130957 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.130957 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-installation-pull-secrets\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.130957 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130949 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4adb4da-c47c-42d2-ba86-f2c0503ec968-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:55:50.131117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.130990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-bound-sa-token\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.131117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c4adb4da-c47c-42d2-ba86-f2c0503ec968-ready\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:55:50.131117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-trusted-ca\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.131117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-tmp-dir\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q"
Apr 23 17:55:50.131117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131093 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af35427b-d504-4ded-b482-891528bafad3-ca-trust-extracted\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:55:50.131302 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q"
Apr 23 17:55:50.131302 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9rc6\" (UniqueName: \"kubernetes.io/projected/c4adb4da-c47c-42d2-ba86-f2c0503ec968-kube-api-access-p9rc6\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:55:50.131302 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-config-volume\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.131302 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkzz5\" (UniqueName: \"kubernetes.io/projected/676e7d7a-e558-49c3-bc63-788e4d3f9a19-kube-api-access-wkzz5\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:50.131302 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.131208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72lw\" (UniqueName: \"kubernetes.io/projected/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-kube-api-access-r72lw\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.176644 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.176612 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c9dz6" event={"ID":"47e47db9-35db-473d-b6e8-181ec486420c","Type":"ContainerStarted","Data":"adc00d2ac30409d714256126be7543babfabad719a26a9007bc3aa6fe400e0af"} Apr 23 17:55:50.176757 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.176656 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c9dz6" event={"ID":"47e47db9-35db-473d-b6e8-181ec486420c","Type":"ContainerStarted","Data":"74b060de970d79866a98e43d53f31d1f51ea2ab8827ffce831674c9a7637caa8"} Apr 23 17:55:50.193975 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.193696 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c9dz6" podStartSLOduration=1.193678984 
podStartE2EDuration="1.193678984s" podCreationTimestamp="2026-04-23 17:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:50.191962169 +0000 UTC m=+195.007327822" watchObservedRunningTime="2026-04-23 17:55:50.193678984 +0000 UTC m=+195.009044633" Apr 23 17:55:50.231884 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.231844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rthwk\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-kube-api-access-rthwk\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.231899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-registry-certificates\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.231943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-image-registry-private-configuration\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.231991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod 
\"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232019 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-installation-pull-secrets\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4adb4da-c47c-42d2-ba86-f2c0503ec968-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232057 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-bound-sa-token\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c4adb4da-c47c-42d2-ba86-f2c0503ec968-ready\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-trusted-ca\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-tmp-dir\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af35427b-d504-4ded-b482-891528bafad3-ca-trust-extracted\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232177 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232195 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9rc6\" (UniqueName: \"kubernetes.io/projected/c4adb4da-c47c-42d2-ba86-f2c0503ec968-kube-api-access-p9rc6\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.232218 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232218 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-config-volume\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.232603 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkzz5\" (UniqueName: \"kubernetes.io/projected/676e7d7a-e558-49c3-bc63-788e4d3f9a19-kube-api-access-wkzz5\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:50.232603 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r72lw\" (UniqueName: \"kubernetes.io/projected/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-kube-api-access-r72lw\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.232603 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:50.232603 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4adb4da-c47c-42d2-ba86-f2c0503ec968-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.232791 ip-10-0-130-202 kubenswrapper[2579]: 
I0423 17:55:50.232713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4adb4da-c47c-42d2-ba86-f2c0503ec968-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.232878 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c4adb4da-c47c-42d2-ba86-f2c0503ec968-ready\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.232930 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4adb4da-c47c-42d2-ba86-f2c0503ec968-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.233028 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-config-volume\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233117 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.233137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-tmp-dir\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.233150 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-registry-certificates\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233182 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls podName:af20cd75-f1bb-4dcd-b179-14bcb34e5ef1 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:50.733162729 +0000 UTC m=+195.548528367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls") pod "dns-default-nq87q" (UID: "af20cd75-f1bb-4dcd-b179-14bcb34e5ef1") : secret "dns-default-metrics-tls" not found Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233232 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233245 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6867cfcb6c-chk7c: secret "image-registry-tls" not found Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233287 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls podName:af35427b-d504-4ded-b482-891528bafad3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:50.733271971 +0000 UTC m=+195.548637598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls") pod "image-registry-6867cfcb6c-chk7c" (UID: "af35427b-d504-4ded-b482-891528bafad3") : secret "image-registry-tls" not found Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.232911 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af35427b-d504-4ded-b482-891528bafad3-ca-trust-extracted\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233359 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:55:50.233528 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.233431 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert podName:676e7d7a-e558-49c3-bc63-788e4d3f9a19 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:50.733412922 +0000 UTC m=+195.548778549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert") pod "ingress-canary-qnq92" (UID: "676e7d7a-e558-49c3-bc63-788e4d3f9a19") : secret "canary-serving-cert" not found Apr 23 17:55:50.234108 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.233714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-trusted-ca\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.236316 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.236294 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-installation-pull-secrets\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.236410 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.236294 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-image-registry-private-configuration\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.242330 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.242281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rthwk\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-kube-api-access-rthwk\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 
17:55:50.244131 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.244083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9rc6\" (UniqueName: \"kubernetes.io/projected/c4adb4da-c47c-42d2-ba86-f2c0503ec968-kube-api-access-p9rc6\") pod \"cni-sysctl-allowlist-ds-hfd28\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") " pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.244744 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.244724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72lw\" (UniqueName: \"kubernetes.io/projected/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-kube-api-access-r72lw\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.245572 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.245547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkzz5\" (UniqueName: \"kubernetes.io/projected/676e7d7a-e558-49c3-bc63-788e4d3f9a19-kube-api-access-wkzz5\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:50.245733 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.245715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-bound-sa-token\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.390754 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.390715 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:50.399130 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:55:50.399093 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4adb4da_c47c_42d2_ba86_f2c0503ec968.slice/crio-b10c5c6d3f3352a4b39aae65b0ec2f70ce2fd59f06974983aa826a2b614ed832 WatchSource:0}: Error finding container b10c5c6d3f3352a4b39aae65b0ec2f70ce2fd59f06974983aa826a2b614ed832: Status 404 returned error can't find the container with id b10c5c6d3f3352a4b39aae65b0ec2f70ce2fd59f06974983aa826a2b614ed832 Apr 23 17:55:50.736369 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.736330 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.736407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:50.736453 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736563 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: 
secret "canary-serving-cert" not found Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736571 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736586 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736607 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6867cfcb6c-chk7c: secret "image-registry-tls" not found Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736634 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls podName:af20cd75-f1bb-4dcd-b179-14bcb34e5ef1 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:51.736619968 +0000 UTC m=+196.551985594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls") pod "dns-default-nq87q" (UID: "af20cd75-f1bb-4dcd-b179-14bcb34e5ef1") : secret "dns-default-metrics-tls" not found Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736647 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls podName:af35427b-d504-4ded-b482-891528bafad3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:51.736641302 +0000 UTC m=+196.552006928 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls") pod "image-registry-6867cfcb6c-chk7c" (UID: "af35427b-d504-4ded-b482-891528bafad3") : secret "image-registry-tls" not found Apr 23 17:55:50.736949 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:50.736659 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert podName:676e7d7a-e558-49c3-bc63-788e4d3f9a19 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:51.736654195 +0000 UTC m=+196.552019820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert") pod "ingress-canary-qnq92" (UID: "676e7d7a-e558-49c3-bc63-788e4d3f9a19") : secret "canary-serving-cert" not found Apr 23 17:55:51.180473 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.180441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" event={"ID":"c4adb4da-c47c-42d2-ba86-f2c0503ec968","Type":"ContainerStarted","Data":"ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c"} Apr 23 17:55:51.180473 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.180477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" event={"ID":"c4adb4da-c47c-42d2-ba86-f2c0503ec968","Type":"ContainerStarted","Data":"b10c5c6d3f3352a4b39aae65b0ec2f70ce2fd59f06974983aa826a2b614ed832"} Apr 23 17:55:51.180767 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.180744 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:51.191643 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.191621 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" Apr 23 17:55:51.202525 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.202475 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" podStartSLOduration=1.20245851 podStartE2EDuration="1.20245851s" podCreationTimestamp="2026-04-23 17:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:55:51.202237611 +0000 UTC m=+196.017603260" watchObservedRunningTime="2026-04-23 17:55:51.20245851 +0000 UTC m=+196.017824163" Apr 23 17:55:51.742902 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.742647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.742904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.742794 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:51.742943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " 
pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.742948 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6867cfcb6c-chk7c: secret "image-registry-tls" not found Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.742987 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.743032 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls podName:af20cd75-f1bb-4dcd-b179-14bcb34e5ef1 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:53.743017001 +0000 UTC m=+198.558382627 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls") pod "dns-default-nq87q" (UID: "af20cd75-f1bb-4dcd-b179-14bcb34e5ef1") : secret "dns-default-metrics-tls" not found Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.743045 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls podName:af35427b-d504-4ded-b482-891528bafad3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:53.743038649 +0000 UTC m=+198.558404275 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls") pod "image-registry-6867cfcb6c-chk7c" (UID: "af35427b-d504-4ded-b482-891528bafad3") : secret "image-registry-tls" not found Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.743074 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:55:51.743283 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:51.743122 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert podName:676e7d7a-e558-49c3-bc63-788e4d3f9a19 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:53.743108705 +0000 UTC m=+198.558474331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert") pod "ingress-canary-qnq92" (UID: "676e7d7a-e558-49c3-bc63-788e4d3f9a19") : secret "canary-serving-cert" not found Apr 23 17:55:52.014358 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:52.014323 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-hfd28"] Apr 23 17:55:52.187253 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:52.187214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qxm9s" event={"ID":"76db35ef-2b60-42ed-bb55-38ec1534f90f","Type":"ContainerStarted","Data":"4d9fa09257ab7db8b89e56afd20fe6b015889847a48a31b9c912fc35b8176271"} Apr 23 17:55:53.188877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:53.188814 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins" 
containerID="cri-o://ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" gracePeriod=30 Apr 23 17:55:53.189289 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:53.189077 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qxm9s" Apr 23 17:55:53.759202 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:53.759164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:53.759348 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:53.759215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:53.759348 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759309 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:53.759348 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759318 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:55:53.759348 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759338 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6867cfcb6c-chk7c: secret "image-registry-tls" not found Apr 23 17:55:53.759348 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:53.759339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:53.759532 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759355 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls podName:af20cd75-f1bb-4dcd-b179-14bcb34e5ef1 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:57.759342424 +0000 UTC m=+202.574708050 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls") pod "dns-default-nq87q" (UID: "af20cd75-f1bb-4dcd-b179-14bcb34e5ef1") : secret "dns-default-metrics-tls" not found Apr 23 17:55:53.759532 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759379 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls podName:af35427b-d504-4ded-b482-891528bafad3 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:57.759367635 +0000 UTC m=+202.574733261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls") pod "image-registry-6867cfcb6c-chk7c" (UID: "af35427b-d504-4ded-b482-891528bafad3") : secret "image-registry-tls" not found Apr 23 17:55:53.759532 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759402 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:55:53.759532 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:53.759448 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert podName:676e7d7a-e558-49c3-bc63-788e4d3f9a19 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:55:57.759435271 +0000 UTC m=+202.574800905 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert") pod "ingress-canary-qnq92" (UID: "676e7d7a-e558-49c3-bc63-788e4d3f9a19") : secret "canary-serving-cert" not found Apr 23 17:55:57.783753 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:57.783712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:55:57.783753 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:57.783760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.783873 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.783887 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.783920 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert podName:676e7d7a-e558-49c3-bc63-788e4d3f9a19 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:05.783905299 +0000 UTC m=+210.599270925 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert") pod "ingress-canary-qnq92" (UID: "676e7d7a-e558-49c3-bc63-788e4d3f9a19") : secret "canary-serving-cert" not found Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.783952 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls podName:af20cd75-f1bb-4dcd-b179-14bcb34e5ef1 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:05.783934662 +0000 UTC m=+210.599300293 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls") pod "dns-default-nq87q" (UID: "af20cd75-f1bb-4dcd-b179-14bcb34e5ef1") : secret "dns-default-metrics-tls" not found Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:57.783947 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.784003 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.784017 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6867cfcb6c-chk7c: secret "image-registry-tls" not found Apr 23 17:55:57.784270 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:57.784052 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls podName:af35427b-d504-4ded-b482-891528bafad3 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:05.784042487 +0000 UTC m=+210.599408113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls") pod "image-registry-6867cfcb6c-chk7c" (UID: "af35427b-d504-4ded-b482-891528bafad3") : secret "image-registry-tls" not found Apr 23 17:55:59.399208 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.399178 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c9dz6_47e47db9-35db-473d-b6e8-181ec486420c/dns-node-resolver/0.log" Apr 23 17:55:59.549378 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.549320 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qxm9s" podStartSLOduration=40.854587317 podStartE2EDuration="43.549301322s" podCreationTimestamp="2026-04-23 17:55:16 +0000 UTC" firstStartedPulling="2026-04-23 17:55:48.745855016 +0000 UTC m=+193.561220643" lastFinishedPulling="2026-04-23 17:55:51.440569018 +0000 UTC m=+196.255934648" observedRunningTime="2026-04-23 17:55:52.202749028 +0000 UTC m=+197.018114677" watchObservedRunningTime="2026-04-23 17:55:59.549301322 +0000 UTC m=+204.364666972" Apr 23 17:55:59.550116 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.550099 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q9nhf"] Apr 23 17:55:59.562354 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.562328 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.564282 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.564256 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q9nhf"] Apr 23 17:55:59.565347 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.565324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5jzrb\"" Apr 23 17:55:59.565347 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.565343 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 17:55:59.565514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.565503 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 17:55:59.565569 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.565520 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 17:55:59.565569 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.565519 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 17:55:59.598197 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.598170 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/96554462-85fb-43e0-998f-5e9898c338b8-crio-socket\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.598197 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.598199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s9pgw\" (UniqueName: \"kubernetes.io/projected/96554462-85fb-43e0-998f-5e9898c338b8-kube-api-access-s9pgw\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.598362 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.598234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/96554462-85fb-43e0-998f-5e9898c338b8-data-volume\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.598362 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.598295 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/96554462-85fb-43e0-998f-5e9898c338b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.598362 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.598313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.698947 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.698880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/96554462-85fb-43e0-998f-5e9898c338b8-crio-socket\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " 
pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.698947 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.698910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9pgw\" (UniqueName: \"kubernetes.io/projected/96554462-85fb-43e0-998f-5e9898c338b8-kube-api-access-s9pgw\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.698947 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.698937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/96554462-85fb-43e0-998f-5e9898c338b8-data-volume\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.699112 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.698970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/96554462-85fb-43e0-998f-5e9898c338b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.699112 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.698997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.699112 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:59.699091 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 23 17:55:59.699256 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.699104 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/96554462-85fb-43e0-998f-5e9898c338b8-crio-socket\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.699256 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:55:59.699156 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls podName:96554462-85fb-43e0-998f-5e9898c338b8 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:00.199136794 +0000 UTC m=+205.014502420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-q9nhf" (UID: "96554462-85fb-43e0-998f-5e9898c338b8") : secret "insights-runtime-extractor-tls" not found Apr 23 17:55:59.699340 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.699315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/96554462-85fb-43e0-998f-5e9898c338b8-data-volume\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.699509 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.699493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/96554462-85fb-43e0-998f-5e9898c338b8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " 
pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:55:59.712583 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:55:59.712557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9pgw\" (UniqueName: \"kubernetes.io/projected/96554462-85fb-43e0-998f-5e9898c338b8-kube-api-access-s9pgw\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:56:00.203858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:00.203812 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:56:00.204017 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:00.203943 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 17:56:00.204061 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:00.204026 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls podName:96554462-85fb-43e0-998f-5e9898c338b8 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:01.204010316 +0000 UTC m=+206.019375942 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-q9nhf" (UID: "96554462-85fb-43e0-998f-5e9898c338b8") : secret "insights-runtime-extractor-tls" not found Apr 23 17:56:00.795649 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:00.795622 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jbbn4_b6c26cb5-e282-4840-a6ae-c60523f49733/node-ca/0.log" Apr 23 17:56:01.182920 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:01.182807 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 23 17:56:01.183744 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:01.183714 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 23 17:56:01.184585 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:01.184558 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"] Apr 23 17:56:01.184661 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:01.184594 2579 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Apr 23 17:56:01.211657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:01.211632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:56:01.211787 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:01.211772 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 17:56:01.211847 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:01.211841 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls podName:96554462-85fb-43e0-998f-5e9898c338b8 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:03.211811643 +0000 UTC m=+208.027177269 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-q9nhf" (UID: "96554462-85fb-43e0-998f-5e9898c338b8") : secret "insights-runtime-extractor-tls" not found Apr 23 17:56:03.225860 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:03.225809 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf" Apr 23 17:56:03.226431 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:03.225978 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 23 17:56:03.226431 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:03.226059 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls podName:96554462-85fb-43e0-998f-5e9898c338b8 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:07.226036343 +0000 UTC m=+212.041401982 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-q9nhf" (UID: "96554462-85fb-43e0-998f-5e9898c338b8") : secret "insights-runtime-extractor-tls" not found Apr 23 17:56:05.845638 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:05.845607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q" Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:05.845655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92" Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:05.845690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:05.845766 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:05.845810 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 
17:56:05.845848 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls podName:af20cd75-f1bb-4dcd-b179-14bcb34e5ef1 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:21.845815871 +0000 UTC m=+226.661181496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls") pod "dns-default-nq87q" (UID: "af20cd75-f1bb-4dcd-b179-14bcb34e5ef1") : secret "dns-default-metrics-tls" not found Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:05.845770 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:05.845868 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6867cfcb6c-chk7c: secret "image-registry-tls" not found Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:05.845874 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert podName:676e7d7a-e558-49c3-bc63-788e4d3f9a19 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:21.845859855 +0000 UTC m=+226.661225480 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert") pod "ingress-canary-qnq92" (UID: "676e7d7a-e558-49c3-bc63-788e4d3f9a19") : secret "canary-serving-cert" not found Apr 23 17:56:05.846097 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:05.845911 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls podName:af35427b-d504-4ded-b482-891528bafad3 nodeName:}" failed. 
No retries permitted until 2026-04-23 17:56:21.845897939 +0000 UTC m=+226.661263567 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls") pod "image-registry-6867cfcb6c-chk7c" (UID: "af35427b-d504-4ded-b482-891528bafad3") : secret "image-registry-tls" not found
Apr 23 17:56:07.257412 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:07.257378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf"
Apr 23 17:56:07.257859 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:07.257536 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:07.257859 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:07.257608 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls podName:96554462-85fb-43e0-998f-5e9898c338b8 nodeName:}" failed. No retries permitted until 2026-04-23 17:56:15.257592008 +0000 UTC m=+220.072957638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls") pod "insights-runtime-extractor-q9nhf" (UID: "96554462-85fb-43e0-998f-5e9898c338b8") : secret "insights-runtime-extractor-tls" not found
Apr 23 17:56:07.660466 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:07.660388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:56:07.660597 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:07.660501 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 17:56:07.660597 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:07.660551 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs podName:67c4e3ae-cc88-433d-8549-c77153e2e1d6 nodeName:}" failed. No retries permitted until 2026-04-23 17:57:11.660539026 +0000 UTC m=+276.475904652 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs") pod "network-metrics-daemon-gnxs9" (UID: "67c4e3ae-cc88-433d-8549-c77153e2e1d6") : secret "metrics-daemon-secret" not found
Apr 23 17:56:11.182425 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:11.182379 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 23 17:56:11.183317 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:11.183284 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 23 17:56:11.184174 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:11.184141 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 23 17:56:11.184259 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:11.184183 2579 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins" probeResult="unknown"
Apr 23 17:56:15.314377 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:15.314338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf"
Apr 23 17:56:15.316983 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:15.316960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/96554462-85fb-43e0-998f-5e9898c338b8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q9nhf\" (UID: \"96554462-85fb-43e0-998f-5e9898c338b8\") " pod="openshift-insights/insights-runtime-extractor-q9nhf"
Apr 23 17:56:15.471541 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:15.471504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q9nhf"
Apr 23 17:56:15.596447 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:15.596370 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q9nhf"]
Apr 23 17:56:15.600479 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:15.600439 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96554462_85fb_43e0_998f_5e9898c338b8.slice/crio-50c1d676e2928407f89a096381741ac83e1f2809ac4c7f99373eebbe31301254 WatchSource:0}: Error finding container 50c1d676e2928407f89a096381741ac83e1f2809ac4c7f99373eebbe31301254: Status 404 returned error can't find the container with id 50c1d676e2928407f89a096381741ac83e1f2809ac4c7f99373eebbe31301254
Apr 23 17:56:16.232195 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:16.232151 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q9nhf" event={"ID":"96554462-85fb-43e0-998f-5e9898c338b8","Type":"ContainerStarted","Data":"e87f33234fe5164673d1ebe43c3c737508b1058e588d082c2df21247104513be"}
Apr 23 17:56:16.232195 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:16.232188 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q9nhf" event={"ID":"96554462-85fb-43e0-998f-5e9898c338b8","Type":"ContainerStarted","Data":"2f64ddd25eace214fbe31617def80e61a76cb084245f7e403de6ea380a3168ab"}
Apr 23 17:56:16.232195 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:16.232198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q9nhf" event={"ID":"96554462-85fb-43e0-998f-5e9898c338b8","Type":"ContainerStarted","Data":"50c1d676e2928407f89a096381741ac83e1f2809ac4c7f99373eebbe31301254"}
Apr 23 17:56:18.181033 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:18.180998 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9sxcg"
Apr 23 17:56:18.238305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:18.238268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q9nhf" event={"ID":"96554462-85fb-43e0-998f-5e9898c338b8","Type":"ContainerStarted","Data":"50e740f4a6950bda44916a8b640d329eb923a9b7fdb40520846f51531f5c2dad"}
Apr 23 17:56:18.253814 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:18.253763 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q9nhf" podStartSLOduration=17.391245718 podStartE2EDuration="19.253746272s" podCreationTimestamp="2026-04-23 17:55:59 +0000 UTC" firstStartedPulling="2026-04-23 17:56:15.648641543 +0000 UTC m=+220.464007174" lastFinishedPulling="2026-04-23 17:56:17.511142096 +0000 UTC m=+222.326507728" observedRunningTime="2026-04-23 17:56:18.25344584 +0000 UTC m=+223.068811488" watchObservedRunningTime="2026-04-23 17:56:18.253746272 +0000 UTC m=+223.069111920"
Apr 23 17:56:21.182335 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:21.182294 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 23 17:56:21.183218 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:21.183189 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 23 17:56:21.184024 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:21.184002 2579 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" cmd=["/bin/bash","-c","test -f /ready/ready"]
Apr 23 17:56:21.184096 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:21.184035 2579 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins" probeResult="unknown"
Apr 23 17:56:21.866057 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.866023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:56:21.866267 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.866074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q"
Apr 23 17:56:21.866267 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.866211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92"
Apr 23 17:56:21.868493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.868464 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af20cd75-f1bb-4dcd-b179-14bcb34e5ef1-metrics-tls\") pod \"dns-default-nq87q\" (UID: \"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1\") " pod="openshift-dns/dns-default-nq87q"
Apr 23 17:56:21.868732 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.868711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"image-registry-6867cfcb6c-chk7c\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:56:21.868773 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.868711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/676e7d7a-e558-49c3-bc63-788e4d3f9a19-cert\") pod \"ingress-canary-qnq92\" (UID: \"676e7d7a-e558-49c3-bc63-788e4d3f9a19\") " pod="openshift-ingress-canary/ingress-canary-qnq92"
Apr 23 17:56:21.884541 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.884512 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nq87q"
Apr 23 17:56:21.907864 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:21.907802 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qnq92"
Apr 23 17:56:22.034711 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:22.034688 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nq87q"]
Apr 23 17:56:22.036687 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:22.036660 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf20cd75_f1bb_4dcd_b179_14bcb34e5ef1.slice/crio-5ee65e4bd6dba656e00364f39484954df8e95022fcb1484e4e5f990a36609cd4 WatchSource:0}: Error finding container 5ee65e4bd6dba656e00364f39484954df8e95022fcb1484e4e5f990a36609cd4: Status 404 returned error can't find the container with id 5ee65e4bd6dba656e00364f39484954df8e95022fcb1484e4e5f990a36609cd4
Apr 23 17:56:22.052167 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:22.052143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qnq92"]
Apr 23 17:56:22.054983 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:22.054961 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676e7d7a_e558_49c3_bc63_788e4d3f9a19.slice/crio-adc34f8ac53869bbd4c9f9b0520fdca0463ed52895394969c7b9bbd0218fabb6 WatchSource:0}: Error finding container adc34f8ac53869bbd4c9f9b0520fdca0463ed52895394969c7b9bbd0218fabb6: Status 404 returned error can't find the container with id adc34f8ac53869bbd4c9f9b0520fdca0463ed52895394969c7b9bbd0218fabb6
Apr 23 17:56:22.168261 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:22.168183 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:56:22.245939 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:22.245904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nq87q" event={"ID":"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1","Type":"ContainerStarted","Data":"5ee65e4bd6dba656e00364f39484954df8e95022fcb1484e4e5f990a36609cd4"}
Apr 23 17:56:22.246995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:22.246971 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnq92" event={"ID":"676e7d7a-e558-49c3-bc63-788e4d3f9a19","Type":"ContainerStarted","Data":"adc34f8ac53869bbd4c9f9b0520fdca0463ed52895394969c7b9bbd0218fabb6"}
Apr 23 17:56:22.296645 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:22.296621 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6867cfcb6c-chk7c"]
Apr 23 17:56:22.298406 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:22.298382 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf35427b_d504_4ded_b482_891528bafad3.slice/crio-cd69ba809c9bec650b15f96e7ec1cf7a665031c8338160915412d28ecced8e28 WatchSource:0}: Error finding container cd69ba809c9bec650b15f96e7ec1cf7a665031c8338160915412d28ecced8e28: Status 404 returned error can't find the container with id cd69ba809c9bec650b15f96e7ec1cf7a665031c8338160915412d28ecced8e28
Apr 23 17:56:23.250842 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:23.250791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" event={"ID":"af35427b-d504-4ded-b482-891528bafad3","Type":"ContainerStarted","Data":"f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094"}
Apr 23 17:56:23.251230 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:23.250847 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" event={"ID":"af35427b-d504-4ded-b482-891528bafad3","Type":"ContainerStarted","Data":"cd69ba809c9bec650b15f96e7ec1cf7a665031c8338160915412d28ecced8e28"}
Apr 23 17:56:23.251230 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:23.250951 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:56:23.285326 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:23.285273 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" podStartSLOduration=88.285257402 podStartE2EDuration="1m28.285257402s" podCreationTimestamp="2026-04-23 17:54:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:56:23.271185854 +0000 UTC m=+228.086551515" watchObservedRunningTime="2026-04-23 17:56:23.285257402 +0000 UTC m=+228.100623049"
Apr 23 17:56:24.005564 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.005540 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-hfd28_c4adb4da-c47c-42d2-ba86-f2c0503ec968/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:56:24.005686 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.005597 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:56:24.086044 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086022 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c4adb4da-c47c-42d2-ba86-f2c0503ec968-ready\") pod \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") "
Apr 23 17:56:24.086179 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086056 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4adb4da-c47c-42d2-ba86-f2c0503ec968-cni-sysctl-allowlist\") pod \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") "
Apr 23 17:56:24.086179 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086086 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4adb4da-c47c-42d2-ba86-f2c0503ec968-tuning-conf-dir\") pod \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") "
Apr 23 17:56:24.086179 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086118 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9rc6\" (UniqueName: \"kubernetes.io/projected/c4adb4da-c47c-42d2-ba86-f2c0503ec968-kube-api-access-p9rc6\") pod \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\" (UID: \"c4adb4da-c47c-42d2-ba86-f2c0503ec968\") "
Apr 23 17:56:24.086339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4adb4da-c47c-42d2-ba86-f2c0503ec968-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "c4adb4da-c47c-42d2-ba86-f2c0503ec968" (UID: "c4adb4da-c47c-42d2-ba86-f2c0503ec968"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Apr 23 17:56:24.086339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086310 2579 reconciler_common.go:299] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c4adb4da-c47c-42d2-ba86-f2c0503ec968-tuning-conf-dir\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:56:24.086339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086321 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4adb4da-c47c-42d2-ba86-f2c0503ec968-ready" (OuterVolumeSpecName: "ready") pod "c4adb4da-c47c-42d2-ba86-f2c0503ec968" (UID: "c4adb4da-c47c-42d2-ba86-f2c0503ec968"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:56:24.086493 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.086412 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4adb4da-c47c-42d2-ba86-f2c0503ec968-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "c4adb4da-c47c-42d2-ba86-f2c0503ec968" (UID: "c4adb4da-c47c-42d2-ba86-f2c0503ec968"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:56:24.089036 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.089009 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4adb4da-c47c-42d2-ba86-f2c0503ec968-kube-api-access-p9rc6" (OuterVolumeSpecName: "kube-api-access-p9rc6") pod "c4adb4da-c47c-42d2-ba86-f2c0503ec968" (UID: "c4adb4da-c47c-42d2-ba86-f2c0503ec968"). InnerVolumeSpecName "kube-api-access-p9rc6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:56:24.187392 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.187303 2579 reconciler_common.go:299] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/c4adb4da-c47c-42d2-ba86-f2c0503ec968-ready\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:56:24.187392 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.187337 2579 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c4adb4da-c47c-42d2-ba86-f2c0503ec968-cni-sysctl-allowlist\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:56:24.187392 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.187355 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9rc6\" (UniqueName: \"kubernetes.io/projected/c4adb4da-c47c-42d2-ba86-f2c0503ec968-kube-api-access-p9rc6\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:56:24.193109 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.193085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qxm9s"
Apr 23 17:56:24.254970 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.254952 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-hfd28_c4adb4da-c47c-42d2-ba86-f2c0503ec968/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:56:24.255323 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.254986 2579 generic.go:358] "Generic (PLEG): container finished" podID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c" exitCode=137
Apr 23 17:56:24.255323 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.255042 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" event={"ID":"c4adb4da-c47c-42d2-ba86-f2c0503ec968","Type":"ContainerDied","Data":"ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c"}
Apr 23 17:56:24.255323 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.255063 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28" event={"ID":"c4adb4da-c47c-42d2-ba86-f2c0503ec968","Type":"ContainerDied","Data":"b10c5c6d3f3352a4b39aae65b0ec2f70ce2fd59f06974983aa826a2b614ed832"}
Apr 23 17:56:24.255323 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.255077 2579 scope.go:117] "RemoveContainer" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c"
Apr 23 17:56:24.255323 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.255081 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-hfd28"
Apr 23 17:56:24.260619 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.260565 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qnq92" event={"ID":"676e7d7a-e558-49c3-bc63-788e4d3f9a19","Type":"ContainerStarted","Data":"c1e88bab0cf51852506c865d32317bedab5f576bbb9191702eeafc521a40e97e"}
Apr 23 17:56:24.262431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.262404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nq87q" event={"ID":"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1","Type":"ContainerStarted","Data":"d0683fba0fe43710d462e48acef0b4da4b23d34751809898111a0e62f4dc23bd"}
Apr 23 17:56:24.266961 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.266941 2579 scope.go:117] "RemoveContainer" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c"
Apr 23 17:56:24.267224 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:24.267199 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c\": container with ID starting with ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c not found: ID does not exist" containerID="ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c"
Apr 23 17:56:24.267316 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.267232 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c"} err="failed to get container status \"ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c\": rpc error: code = NotFound desc = could not find container \"ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c\": container with ID starting with ca9d0a3dd21f893c57dc1f010b80544ebfec058d8fa465bd7f2d5f455465222c not found: ID does not exist"
Apr 23 17:56:24.277053 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.277006 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qnq92" podStartSLOduration=32.302776516 podStartE2EDuration="34.276990918s" podCreationTimestamp="2026-04-23 17:55:50 +0000 UTC" firstStartedPulling="2026-04-23 17:56:22.056562801 +0000 UTC m=+226.871928430" lastFinishedPulling="2026-04-23 17:56:24.030777189 +0000 UTC m=+228.846142832" observedRunningTime="2026-04-23 17:56:24.276332797 +0000 UTC m=+229.091698446" watchObservedRunningTime="2026-04-23 17:56:24.276990918 +0000 UTC m=+229.092356567"
Apr 23 17:56:24.293277 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.293251 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-hfd28"]
Apr 23 17:56:24.297057 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.297028 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-hfd28"]
Apr 23 17:56:24.594167 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.594094 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"]
Apr 23 17:56:24.594313 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.594302 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins"
Apr 23 17:56:24.594363 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.594315 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins"
Apr 23 17:56:24.594401 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.594364 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" containerName="kube-multus-additional-cni-plugins"
Apr 23 17:56:24.598168 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.598153 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:24.601007 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.600982 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-l84qg\""
Apr 23 17:56:24.601130 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.601038 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 17:56:24.604899 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.604874 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"]
Apr 23 17:56:24.691635 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.691591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a66fbd-c07c-463a-afd9-fd6e00f9fbbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t6rbc\" (UID: \"49a66fbd-c07c-463a-afd9-fd6e00f9fbbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:24.792926 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:24.792897 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a66fbd-c07c-463a-afd9-fd6e00f9fbbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t6rbc\" (UID: \"49a66fbd-c07c-463a-afd9-fd6e00f9fbbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:24.793034 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:24.793018 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 23 17:56:24.793086 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:24.793076 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49a66fbd-c07c-463a-afd9-fd6e00f9fbbb-tls-certificates podName:49a66fbd-c07c-463a-afd9-fd6e00f9fbbb nodeName:}" failed. No retries permitted until 2026-04-23 17:56:25.293062202 +0000 UTC m=+230.108427827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/49a66fbd-c07c-463a-afd9-fd6e00f9fbbb-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-t6rbc" (UID: "49a66fbd-c07c-463a-afd9-fd6e00f9fbbb") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 23 17:56:25.269505 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.269471 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nq87q" event={"ID":"af20cd75-f1bb-4dcd-b179-14bcb34e5ef1","Type":"ContainerStarted","Data":"69f490ac63dc036c1badacb16e1e0860e6da20388b03b67f1c9250d2f0e79d3a"}
Apr 23 17:56:25.289469 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.289424 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nq87q" podStartSLOduration=33.300349806 podStartE2EDuration="35.289410258s" podCreationTimestamp="2026-04-23 17:55:50 +0000 UTC" firstStartedPulling="2026-04-23 17:56:22.038478285 +0000 UTC m=+226.853843912" lastFinishedPulling="2026-04-23 17:56:24.027538739 +0000 UTC m=+228.842904364" observedRunningTime="2026-04-23 17:56:25.288547519 +0000 UTC m=+230.103913203" watchObservedRunningTime="2026-04-23 17:56:25.289410258 +0000 UTC m=+230.104775906"
Apr 23 17:56:25.296352 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.296325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a66fbd-c07c-463a-afd9-fd6e00f9fbbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t6rbc\" (UID: \"49a66fbd-c07c-463a-afd9-fd6e00f9fbbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:25.298841 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.298807 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/49a66fbd-c07c-463a-afd9-fd6e00f9fbbb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-t6rbc\" (UID: \"49a66fbd-c07c-463a-afd9-fd6e00f9fbbb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:25.507167 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.507139 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:25.620842 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.620787 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"]
Apr 23 17:56:25.623479 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:25.623450 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a66fbd_c07c_463a_afd9_fd6e00f9fbbb.slice/crio-d8ef627662bd3592dcbef6c0cc6bb4ca10168e327a255479aa069617d3ebdd42 WatchSource:0}: Error finding container d8ef627662bd3592dcbef6c0cc6bb4ca10168e327a255479aa069617d3ebdd42: Status 404 returned error can't find the container with id d8ef627662bd3592dcbef6c0cc6bb4ca10168e327a255479aa069617d3ebdd42
Apr 23 17:56:25.801175 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:25.801143 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4adb4da-c47c-42d2-ba86-f2c0503ec968" path="/var/lib/kubelet/pods/c4adb4da-c47c-42d2-ba86-f2c0503ec968/volumes"
Apr 23 17:56:26.273922 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:26.273858 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc" event={"ID":"49a66fbd-c07c-463a-afd9-fd6e00f9fbbb","Type":"ContainerStarted","Data":"d8ef627662bd3592dcbef6c0cc6bb4ca10168e327a255479aa069617d3ebdd42"}
Apr 23 17:56:26.274341 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:26.274122 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nq87q"
Apr 23 17:56:27.280732 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.280693 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc" event={"ID":"49a66fbd-c07c-463a-afd9-fd6e00f9fbbb","Type":"ContainerStarted","Data":"5ca6c15a0ab010ed4ccc9bb1e1e9c1880fa4d7b7c015828bfe243b9c5e70863c"}
Apr 23 17:56:27.281170 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.280873 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:27.285383 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.285360 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc"
Apr 23 17:56:27.295188 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.295151 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-t6rbc" podStartSLOduration=2.433669319 podStartE2EDuration="3.295139474s" podCreationTimestamp="2026-04-23 17:56:24 +0000 UTC" firstStartedPulling="2026-04-23 17:56:25.625326398 +0000 UTC m=+230.440692024" lastFinishedPulling="2026-04-23 17:56:26.486796548 +0000 UTC m=+231.302162179" observedRunningTime="2026-04-23 17:56:27.294737566 +0000 UTC m=+232.110103214" watchObservedRunningTime="2026-04-23 17:56:27.295139474 +0000 UTC m=+232.110505162"
Apr 23 17:56:27.670914 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.670810 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jtmbp"]
Apr 23 17:56:27.675549 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.675533 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp"
Apr 23 17:56:27.679579 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.679557 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 17:56:27.679773 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.679577 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 17:56:27.679773 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.679588 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 17:56:27.679773 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.679603 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 17:56:27.679773 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.679670 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-gs5ck\""
Apr 23 17:56:27.679773 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.679557 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 17:56:27.681454 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.681427 
2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jtmbp"] Apr 23 17:56:27.814326 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.814292 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0448d038-a900-4370-b832-441691c982c3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.814326 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.814322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0448d038-a900-4370-b832-441691c982c3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.814538 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.814344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0448d038-a900-4370-b832-441691c982c3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.814538 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.814449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk5jj\" (UniqueName: \"kubernetes.io/projected/0448d038-a900-4370-b832-441691c982c3-kube-api-access-hk5jj\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.915539 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.915485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk5jj\" (UniqueName: \"kubernetes.io/projected/0448d038-a900-4370-b832-441691c982c3-kube-api-access-hk5jj\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.915696 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.915559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0448d038-a900-4370-b832-441691c982c3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.915696 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.915583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0448d038-a900-4370-b832-441691c982c3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.915696 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.915604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0448d038-a900-4370-b832-441691c982c3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.916271 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.916246 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0448d038-a900-4370-b832-441691c982c3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.918226 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.918205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0448d038-a900-4370-b832-441691c982c3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.918298 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.918280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0448d038-a900-4370-b832-441691c982c3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.924961 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.924906 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk5jj\" (UniqueName: \"kubernetes.io/projected/0448d038-a900-4370-b832-441691c982c3-kube-api-access-hk5jj\") pod \"prometheus-operator-5676c8c784-jtmbp\" (UID: \"0448d038-a900-4370-b832-441691c982c3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:27.985881 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:27.985851 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" Apr 23 17:56:28.100906 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:28.100873 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jtmbp"] Apr 23 17:56:28.103454 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:28.103427 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0448d038_a900_4370_b832_441691c982c3.slice/crio-1f9ee5588e0739c8466553346158267f0a981960cf23b6de6a4124164894a0ac WatchSource:0}: Error finding container 1f9ee5588e0739c8466553346158267f0a981960cf23b6de6a4124164894a0ac: Status 404 returned error can't find the container with id 1f9ee5588e0739c8466553346158267f0a981960cf23b6de6a4124164894a0ac Apr 23 17:56:28.284489 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:28.284451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" event={"ID":"0448d038-a900-4370-b832-441691c982c3","Type":"ContainerStarted","Data":"1f9ee5588e0739c8466553346158267f0a981960cf23b6de6a4124164894a0ac"} Apr 23 17:56:29.288444 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:29.288408 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" event={"ID":"0448d038-a900-4370-b832-441691c982c3","Type":"ContainerStarted","Data":"cccad2dc69d77965b664cc2485406486f2e0040a0eed59f41ad6261cec0967fc"} Apr 23 17:56:30.292283 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:30.292248 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" event={"ID":"0448d038-a900-4370-b832-441691c982c3","Type":"ContainerStarted","Data":"09ad532b04b7d9b0df16a5d21632a648b75376d91f80b05d3d9593e1a567fbad"} Apr 23 17:56:30.307168 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:30.307118 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jtmbp" podStartSLOduration=2.187107257 podStartE2EDuration="3.307106478s" podCreationTimestamp="2026-04-23 17:56:27 +0000 UTC" firstStartedPulling="2026-04-23 17:56:28.105303385 +0000 UTC m=+232.920669011" lastFinishedPulling="2026-04-23 17:56:29.225302604 +0000 UTC m=+234.040668232" observedRunningTime="2026-04-23 17:56:30.306763997 +0000 UTC m=+235.122129646" watchObservedRunningTime="2026-04-23 17:56:30.307106478 +0000 UTC m=+235.122472126" Apr 23 17:56:32.018683 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.018649 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl"] Apr 23 17:56:32.021812 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.021795 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.024483 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.024460 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 17:56:32.024589 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.024488 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-97zl2\"" Apr 23 17:56:32.024589 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.024492 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 17:56:32.029560 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.029538 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl"] Apr 23 17:56:32.037705 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.037686 2579 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4926f"] Apr 23 17:56:32.040519 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.040503 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.043220 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.043205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 17:56:32.044124 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.044100 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 17:56:32.044252 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.044235 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 17:56:32.044319 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.044272 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qddvv\"" Apr 23 17:56:32.060500 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.060476 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-fz4sr"] Apr 23 17:56:32.063460 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.063445 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.067054 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.067036 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 17:56:32.067157 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.067037 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-bk2zs\"" Apr 23 17:56:32.067220 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.067162 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 17:56:32.067220 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.067212 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 17:56:32.073808 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.073789 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-fz4sr"] Apr 23 17:56:32.146006 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.145975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-tls\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146006 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.146175 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.146175 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-sys\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146175 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146111 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.146175 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.146292 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146180 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-textfile\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146292 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.146292 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146214 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.146292 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146278 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.146405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146303 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6kpf\" (UniqueName: \"kubernetes.io/projected/a8df34d0-5aac-4603-b040-3396c2646e7a-kube-api-access-w6kpf\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-wtmp\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146339 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mk7f\" (UniqueName: \"kubernetes.io/projected/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-kube-api-access-5mk7f\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.146405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8df34d0-5aac-4603-b040-3396c2646e7a-metrics-client-ca\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.146405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146394 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknfd\" (UniqueName: \"kubernetes.io/projected/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-api-access-hknfd\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.146570 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-root\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146570 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.146570 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.146487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4926f\" (UID: 
\"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.247520 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.247480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.247694 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.247534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.247694 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.247567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-textfile\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.247694 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.247593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.247694 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.247615 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.248020 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.247993 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-textfile\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248101 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248067 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.248169 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6kpf\" (UniqueName: \"kubernetes.io/projected/a8df34d0-5aac-4603-b040-3396c2646e7a-kube-api-access-w6kpf\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248169 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248127 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-wtmp\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " 
pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248169 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mk7f\" (UniqueName: \"kubernetes.io/projected/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-kube-api-access-5mk7f\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.248335 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248177 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8df34d0-5aac-4603-b040-3396c2646e7a-metrics-client-ca\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248335 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.248335 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hknfd\" (UniqueName: \"kubernetes.io/projected/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-api-access-hknfd\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.248335 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248262 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-root\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248335 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248580 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248580 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248424 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.248580 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248467 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.248727 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.248727 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-root\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.248853 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-wtmp\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.249011 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.248960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8df34d0-5aac-4603-b040-3396c2646e7a-metrics-client-ca\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.249129 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-tls\") pod \"node-exporter-4926f\" (UID: 
\"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.249129 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249049 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.249129 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.249274 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-sys\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.249274 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:32.249263 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 17:56:32.249377 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:56:32.249333 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-tls podName:1f3971d2-59c6-44bc-b53a-c1f78a8b154e nodeName:}" failed. 
No retries permitted until 2026-04-23 17:56:32.749313664 +0000 UTC m=+237.564679290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-fz4sr" (UID: "1f3971d2-59c6-44bc-b53a-c1f78a8b154e") : secret "kube-state-metrics-tls" not found Apr 23 17:56:32.249377 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-accelerators-collector-config\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.249475 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8df34d0-5aac-4603-b040-3396c2646e7a-sys\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.249693 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.249667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.250693 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.250667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.250693 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.250687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.251505 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.251491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.251656 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.251638 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.252063 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.252045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a8df34d0-5aac-4603-b040-3396c2646e7a-node-exporter-tls\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " 
pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.262213 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.262187 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknfd\" (UniqueName: \"kubernetes.io/projected/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-api-access-hknfd\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.262315 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.262279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6kpf\" (UniqueName: \"kubernetes.io/projected/a8df34d0-5aac-4603-b040-3396c2646e7a-kube-api-access-w6kpf\") pod \"node-exporter-4926f\" (UID: \"a8df34d0-5aac-4603-b040-3396c2646e7a\") " pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.262804 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.262785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mk7f\" (UniqueName: \"kubernetes.io/projected/e11ff4c9-1509-4a31-a2de-6dd234e3cd0c-kube-api-access-5mk7f\") pod \"openshift-state-metrics-9d44df66c-hhctl\" (UID: \"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.330803 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.330727 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" Apr 23 17:56:32.348704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.348681 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4926f" Apr 23 17:56:32.356613 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:32.356583 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8df34d0_5aac_4603_b040_3396c2646e7a.slice/crio-904cb1d4a2a133f385cd2da0d1b87831f04c72d20980b75ae27bcade022cd5bb WatchSource:0}: Error finding container 904cb1d4a2a133f385cd2da0d1b87831f04c72d20980b75ae27bcade022cd5bb: Status 404 returned error can't find the container with id 904cb1d4a2a133f385cd2da0d1b87831f04c72d20980b75ae27bcade022cd5bb Apr 23 17:56:32.447814 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.447787 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl"] Apr 23 17:56:32.450918 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:32.450880 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11ff4c9_1509_4a31_a2de_6dd234e3cd0c.slice/crio-4d95cf2b6acb12e99333da057973dfeb83c0e8b0af6799ad30a497cf486f8006 WatchSource:0}: Error finding container 4d95cf2b6acb12e99333da057973dfeb83c0e8b0af6799ad30a497cf486f8006: Status 404 returned error can't find the container with id 4d95cf2b6acb12e99333da057973dfeb83c0e8b0af6799ad30a497cf486f8006 Apr 23 17:56:32.753243 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.753217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.756615 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.756588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f3971d2-59c6-44bc-b53a-c1f78a8b154e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-fz4sr\" (UID: \"1f3971d2-59c6-44bc-b53a-c1f78a8b154e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:32.971374 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:32.971342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" Apr 23 17:56:33.251244 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.248548 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-fz4sr"] Apr 23 17:56:33.254247 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:33.254218 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f3971d2_59c6_44bc_b53a_c1f78a8b154e.slice/crio-f9f183e920ed13788b85ef1657a2e6a5d9b9b3a4c8e3d25d538f0c7d96c5b729 WatchSource:0}: Error finding container f9f183e920ed13788b85ef1657a2e6a5d9b9b3a4c8e3d25d538f0c7d96c5b729: Status 404 returned error can't find the container with id f9f183e920ed13788b85ef1657a2e6a5d9b9b3a4c8e3d25d538f0c7d96c5b729 Apr 23 17:56:33.306947 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.306907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" event={"ID":"1f3971d2-59c6-44bc-b53a-c1f78a8b154e","Type":"ContainerStarted","Data":"f9f183e920ed13788b85ef1657a2e6a5d9b9b3a4c8e3d25d538f0c7d96c5b729"} Apr 23 17:56:33.308898 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.308868 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" event={"ID":"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c","Type":"ContainerStarted","Data":"5eb75dffb03f3b903a43f14ac5da6c0329e70037a3605a8fb1ae0fbeafb769bf"} Apr 23 17:56:33.309024 
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.308904 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" event={"ID":"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c","Type":"ContainerStarted","Data":"1230a77147b18b7fb369ac7c1ba1feddcda39a48af4d2f7fdaa43a1c5ee5662a"} Apr 23 17:56:33.309024 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.308919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" event={"ID":"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c","Type":"ContainerStarted","Data":"4d95cf2b6acb12e99333da057973dfeb83c0e8b0af6799ad30a497cf486f8006"} Apr 23 17:56:33.310049 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.310025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4926f" event={"ID":"a8df34d0-5aac-4603-b040-3396c2646e7a","Type":"ContainerStarted","Data":"528a305a1b2a5a9cbf461e43752210d9c12cf5a63af11539645acc0400880708"} Apr 23 17:56:33.310186 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:33.310055 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4926f" event={"ID":"a8df34d0-5aac-4603-b040-3396c2646e7a","Type":"ContainerStarted","Data":"904cb1d4a2a133f385cd2da0d1b87831f04c72d20980b75ae27bcade022cd5bb"} Apr 23 17:56:34.142072 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.141983 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-65b888866-skbbk"] Apr 23 17:56:34.145813 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.145791 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.149761 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.149719 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 17:56:34.150305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.150284 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 17:56:34.150425 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.150334 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ckg4m11o4au29\"" Apr 23 17:56:34.150615 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.150594 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 17:56:34.150717 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.150663 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 17:56:34.150717 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.150687 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-slkp5\"" Apr 23 17:56:34.151655 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.151626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 17:56:34.159511 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.159491 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65b888866-skbbk"] Apr 23 17:56:34.267190 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267158 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-grpc-tls\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267196 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267216 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qw69\" (UniqueName: \"kubernetes.io/projected/8860c6ed-5eaa-426d-bf4c-421a54cda563-kube-api-access-2qw69\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-tls\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8860c6ed-5eaa-426d-bf4c-421a54cda563-metrics-client-ca\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.267514 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.267458 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.314491 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.314461 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" 
event={"ID":"e11ff4c9-1509-4a31-a2de-6dd234e3cd0c","Type":"ContainerStarted","Data":"e5ff3c96a5ffba0156ce52d54550d0e2c12c91528497b19d20b79ecc9b079e5b"} Apr 23 17:56:34.315895 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.315859 2579 generic.go:358] "Generic (PLEG): container finished" podID="a8df34d0-5aac-4603-b040-3396c2646e7a" containerID="528a305a1b2a5a9cbf461e43752210d9c12cf5a63af11539645acc0400880708" exitCode=0 Apr 23 17:56:34.315988 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.315934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4926f" event={"ID":"a8df34d0-5aac-4603-b040-3396c2646e7a","Type":"ContainerDied","Data":"528a305a1b2a5a9cbf461e43752210d9c12cf5a63af11539645acc0400880708"} Apr 23 17:56:34.335087 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.335048 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hhctl" podStartSLOduration=1.043591993 podStartE2EDuration="2.335033635s" podCreationTimestamp="2026-04-23 17:56:32 +0000 UTC" firstStartedPulling="2026-04-23 17:56:32.572924365 +0000 UTC m=+237.388289994" lastFinishedPulling="2026-04-23 17:56:33.86436601 +0000 UTC m=+238.679731636" observedRunningTime="2026-04-23 17:56:34.33402123 +0000 UTC m=+239.149386878" watchObservedRunningTime="2026-04-23 17:56:34.335033635 +0000 UTC m=+239.150399283" Apr 23 17:56:34.367984 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.367795 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8860c6ed-5eaa-426d-bf4c-421a54cda563-metrics-client-ca\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.367984 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.367886 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.367984 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.367951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-grpc-tls\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.367984 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.367979 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.368404 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.368005 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qw69\" (UniqueName: \"kubernetes.io/projected/8860c6ed-5eaa-426d-bf4c-421a54cda563-kube-api-access-2qw69\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.368404 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.368032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-tls\") pod 
\"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.368594 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.368570 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8860c6ed-5eaa-426d-bf4c-421a54cda563-metrics-client-ca\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.368714 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.368699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.368780 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.368742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.371558 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.371535 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 
17:56:34.371643 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.371604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.372077 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.372037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.372150 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.372118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.372352 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.372332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-grpc-tls\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:34.372792 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.372771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" 
(UniqueName: \"kubernetes.io/secret/8860c6ed-5eaa-426d-bf4c-421a54cda563-secret-thanos-querier-tls\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk"
Apr 23 17:56:34.384851 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.384790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qw69\" (UniqueName: \"kubernetes.io/projected/8860c6ed-5eaa-426d-bf4c-421a54cda563-kube-api-access-2qw69\") pod \"thanos-querier-65b888866-skbbk\" (UID: \"8860c6ed-5eaa-426d-bf4c-421a54cda563\") " pod="openshift-monitoring/thanos-querier-65b888866-skbbk"
Apr 23 17:56:34.457854 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.457319 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-65b888866-skbbk"
Apr 23 17:56:34.628967 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:34.628916 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65b888866-skbbk"]
Apr 23 17:56:34.634007 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:34.633963 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8860c6ed_5eaa_426d_bf4c_421a54cda563.slice/crio-e59b4b37a312c6ba1c2e9a5643d11cf1e9e71296fa879d27a66edd3b744f4828 WatchSource:0}: Error finding container e59b4b37a312c6ba1c2e9a5643d11cf1e9e71296fa879d27a66edd3b744f4828: Status 404 returned error can't find the container with id e59b4b37a312c6ba1c2e9a5643d11cf1e9e71296fa879d27a66edd3b744f4828
Apr 23 17:56:35.320153 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.320105 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"e59b4b37a312c6ba1c2e9a5643d11cf1e9e71296fa879d27a66edd3b744f4828"}
Apr 23 17:56:35.322522 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.322491 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" event={"ID":"1f3971d2-59c6-44bc-b53a-c1f78a8b154e","Type":"ContainerStarted","Data":"013b593b9da04d236bc763604cee859dee79b721f67ac2257518f71ffdebbe19"}
Apr 23 17:56:35.322664 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.322529 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" event={"ID":"1f3971d2-59c6-44bc-b53a-c1f78a8b154e","Type":"ContainerStarted","Data":"090c066f65d8fb95cabf1a7660c561ecc7134c9c3b3869a02e7b4cd0ff3ea2d8"}
Apr 23 17:56:35.322664 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.322544 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" event={"ID":"1f3971d2-59c6-44bc-b53a-c1f78a8b154e","Type":"ContainerStarted","Data":"60d39b78a8f663c845d6886e48644785106413a5db3615968b2e2697144e992c"}
Apr 23 17:56:35.325289 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.325263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4926f" event={"ID":"a8df34d0-5aac-4603-b040-3396c2646e7a","Type":"ContainerStarted","Data":"d57b374f6a0f9af475ee1c6e1bef253a6da83c04871d95a7eb0841415eadb2bc"}
Apr 23 17:56:35.325451 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.325304 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4926f" event={"ID":"a8df34d0-5aac-4603-b040-3396c2646e7a","Type":"ContainerStarted","Data":"a3416420543d668eff1cec4439cbbcbcbc1684f04ac789de61f7ea21e0fa8acd"}
Apr 23 17:56:35.353155 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.353101 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-fz4sr" podStartSLOduration=2.312064257 podStartE2EDuration="3.353085313s" podCreationTimestamp="2026-04-23 17:56:32 +0000 UTC" firstStartedPulling="2026-04-23 17:56:33.258089291 +0000 UTC m=+238.073454929" lastFinishedPulling="2026-04-23 17:56:34.299110359 +0000 UTC m=+239.114475985" observedRunningTime="2026-04-23 17:56:35.351729755 +0000 UTC m=+240.167095403" watchObservedRunningTime="2026-04-23 17:56:35.353085313 +0000 UTC m=+240.168450962"
Apr 23 17:56:35.388011 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:35.387955 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4926f" podStartSLOduration=2.578968128 podStartE2EDuration="3.387938997s" podCreationTimestamp="2026-04-23 17:56:32 +0000 UTC" firstStartedPulling="2026-04-23 17:56:32.358431994 +0000 UTC m=+237.173797619" lastFinishedPulling="2026-04-23 17:56:33.167402846 +0000 UTC m=+237.982768488" observedRunningTime="2026-04-23 17:56:35.387787266 +0000 UTC m=+240.203152915" watchObservedRunningTime="2026-04-23 17:56:35.387938997 +0000 UTC m=+240.203304646"
Apr 23 17:56:36.282984 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:36.282961 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nq87q"
Apr 23 17:56:36.329677 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:36.329646 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"44b7a717b57aa13fc8e363d44f5b6ea1b51df45a60fba891bdd50a33dd09b9f4"}
Apr 23 17:56:37.263539 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.263502 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"]
Apr 23 17:56:37.268915 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.268892 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.271777 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.271752 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 17:56:37.271921 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.271788 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 17:56:37.272163 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.272147 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 17:56:37.272359 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.272345 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j9ps6\""
Apr 23 17:56:37.272478 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.272467 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 17:56:37.272594 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.272573 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 17:56:37.283528 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.283504 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"]
Apr 23 17:56:37.288671 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.288651 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 17:56:37.335594 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.335566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"c63e36e16a0a53fd644f6d603e904be05a48293a60f89933a2b3bd31bb485c5a"}
Apr 23 17:56:37.335594 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.335601 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"2e9cad370b16b82960fe6fa115a43cc66bc7e91c4ef49f0c966af35aac916e3f"}
Apr 23 17:56:37.335992 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.335611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"03b2209e4d93340446849bedf23b982f0764f394b4782b63f35078240b184bf5"}
Apr 23 17:56:37.335992 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.335623 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"472165305a2e132b79787d63a8f8ac46f1bc15664c49f3416bb563722dd89091"}
Apr 23 17:56:37.335992 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.335631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" event={"ID":"8860c6ed-5eaa-426d-bf4c-421a54cda563","Type":"ContainerStarted","Data":"3f491241086929e661fb6928bc77da331788d99176233b1283aefe6146119a64"}
Apr 23 17:56:37.335992 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.335795 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-65b888866-skbbk"
Apr 23 17:56:37.359766 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.359718 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" podStartSLOduration=0.875174701 podStartE2EDuration="3.359705267s" podCreationTimestamp="2026-04-23 17:56:34 +0000 UTC" firstStartedPulling="2026-04-23 17:56:34.635985463 +0000 UTC m=+239.451351091" lastFinishedPulling="2026-04-23 17:56:37.120516031 +0000 UTC m=+241.935881657" observedRunningTime="2026-04-23 17:56:37.35940157 +0000 UTC m=+242.174767229" watchObservedRunningTime="2026-04-23 17:56:37.359705267 +0000 UTC m=+242.175070914"
Apr 23 17:56:37.393751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-secret-telemeter-client\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393751 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393746 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-serving-certs-ca-bundle\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393773 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-telemeter-client-tls\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393797 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-federate-client-tls\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393818 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393858 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grlzn\" (UniqueName: \"kubernetes.io/projected/1048dfda-10d2-4413-b065-476808e431a8-kube-api-access-grlzn\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.393995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.393881 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-metrics-client-ca\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495156 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-metrics-client-ca\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495362 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495433 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-secret-telemeter-client\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495491 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-serving-certs-ca-bundle\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495491 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-telemeter-client-tls\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-federate-client-tls\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.495624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.495570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grlzn\" (UniqueName: \"kubernetes.io/projected/1048dfda-10d2-4413-b065-476808e431a8-kube-api-access-grlzn\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.496046 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.496020 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-metrics-client-ca\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.496926 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.496799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.500519 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.498714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1048dfda-10d2-4413-b065-476808e431a8-serving-certs-ca-bundle\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.500519 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.499396 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.505898 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.501935 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-telemeter-client-tls\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.505898 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.504295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grlzn\" (UniqueName: \"kubernetes.io/projected/1048dfda-10d2-4413-b065-476808e431a8-kube-api-access-grlzn\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.505898 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.505701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-secret-telemeter-client\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.506502 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.506484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1048dfda-10d2-4413-b065-476808e431a8-federate-client-tls\") pod \"telemeter-client-5dd6cfc8f5-9ll9g\" (UID: \"1048dfda-10d2-4413-b065-476808e431a8\") " pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.589889 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.589845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"
Apr 23 17:56:37.735062 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:37.735027 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g"]
Apr 23 17:56:37.738860 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:37.738812 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1048dfda_10d2_4413_b065_476808e431a8.slice/crio-f9c4fb979aec90e386ffea4ce97dfa1091d16ec8db7fc8d2328cb6c16c684a38 WatchSource:0}: Error finding container f9c4fb979aec90e386ffea4ce97dfa1091d16ec8db7fc8d2328cb6c16c684a38: Status 404 returned error can't find the container with id f9c4fb979aec90e386ffea4ce97dfa1091d16ec8db7fc8d2328cb6c16c684a38
Apr 23 17:56:38.315064 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.315031 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:56:38.319494 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.319464 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.322139 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.322114 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 17:56:38.323307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.322558 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 17:56:38.323307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.322709 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rs2hc\""
Apr 23 17:56:38.323307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.322815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6blulatql74b7\""
Apr 23 17:56:38.323307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.322848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 17:56:38.323307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.323044 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 17:56:38.323307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.323069 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 17:56:38.325042 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.324788 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:56:38.325042 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.325011 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 17:56:38.325406 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.325379 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 17:56:38.325524 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.325438 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 17:56:38.325524 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.325471 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 17:56:38.326014 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.325607 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 17:56:38.326354 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.326066 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 17:56:38.327596 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.327572 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 17:56:38.332978 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.332956 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:56:38.342299 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.342267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g" event={"ID":"1048dfda-10d2-4413-b065-476808e431a8","Type":"ContainerStarted","Data":"f9c4fb979aec90e386ffea4ce97dfa1091d16ec8db7fc8d2328cb6c16c684a38"}
Apr 23 17:56:38.404899 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.404848 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405066 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.404939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405066 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.404980 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405066 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405066 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405284 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405192 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405284 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405273 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405374 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405374 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405374 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405353 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-config\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405463 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405405 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405463 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405463 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405576 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405481 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405576 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405576 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405576 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k99tt\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-kube-api-access-k99tt\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.405748 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.405578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506299 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506494 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506322 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506494 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506494 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506494 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-config\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.506702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506850 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23
17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k99tt\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-kube-api-access-k99tt\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.506949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.507742 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.507671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.510672 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.510130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" 
(UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.510672 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.510353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.511200 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.510882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.512240 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.512217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.513307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.512866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.513307 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.513267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.513990 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.513853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.514980 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.514236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.514980 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.514690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.515953 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.515929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.516718 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.516678 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.516812 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.516784 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-config\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.517646 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.517623 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.517733 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.517692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.517974 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.517954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.518520 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.518495 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.526089 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.525130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k99tt\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-kube-api-access-k99tt\") pod \"prometheus-k8s-0\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.634084 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.633979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:38.801134 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:38.799526 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:56:38.802910 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:56:38.802876 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d524af_03d1_4085_a649_4d273222e4ed.slice/crio-0736c0829bce78f7dc606b4fe4bba7703dc9959072351da76111f723a66367a7 WatchSource:0}: Error finding container 0736c0829bce78f7dc606b4fe4bba7703dc9959072351da76111f723a66367a7: Status 404 returned error can't find the container with id 0736c0829bce78f7dc606b4fe4bba7703dc9959072351da76111f723a66367a7 Apr 23 17:56:39.346117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:39.346083 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"0736c0829bce78f7dc606b4fe4bba7703dc9959072351da76111f723a66367a7"} Apr 23 17:56:40.349994 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:56:40.349962 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8" exitCode=0 Apr 23 17:56:40.350440 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:40.350051 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} Apr 23 17:56:40.352057 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:40.352033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g" event={"ID":"1048dfda-10d2-4413-b065-476808e431a8","Type":"ContainerStarted","Data":"b8aedc103413819e6c0b4ee83776ce1d4e6487fa7307c3ebf2a1847da09624f8"} Apr 23 17:56:40.352162 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:40.352074 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g" event={"ID":"1048dfda-10d2-4413-b065-476808e431a8","Type":"ContainerStarted","Data":"d69dcf9fe8a3d31b8967513bc20c70ad5778eddfd1b1242eba89304b6f85f316"} Apr 23 17:56:40.352162 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:40.352084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g" event={"ID":"1048dfda-10d2-4413-b065-476808e431a8","Type":"ContainerStarted","Data":"e4dd7c554172d9bf72e4d82e8565c9be29dd9f56274e292f46d67891aff15c91"} Apr 23 17:56:40.404494 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:40.404439 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5dd6cfc8f5-9ll9g" podStartSLOduration=1.281035863 podStartE2EDuration="3.404425331s" podCreationTimestamp="2026-04-23 17:56:37 +0000 UTC" firstStartedPulling="2026-04-23 
17:56:37.740710426 +0000 UTC m=+242.556076052" lastFinishedPulling="2026-04-23 17:56:39.864099879 +0000 UTC m=+244.679465520" observedRunningTime="2026-04-23 17:56:40.403427537 +0000 UTC m=+245.218793188" watchObservedRunningTime="2026-04-23 17:56:40.404425331 +0000 UTC m=+245.219790978" Apr 23 17:56:43.349587 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:43.349565 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-65b888866-skbbk" Apr 23 17:56:43.366963 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:43.366934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} Apr 23 17:56:43.367121 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:43.367102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} Apr 23 17:56:43.367225 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:43.367209 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} Apr 23 17:56:43.367311 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:43.367296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} Apr 23 17:56:44.267001 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:44.266888 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:56:44.373719 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:44.373682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} Apr 23 17:56:44.373719 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:44.373723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerStarted","Data":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} Apr 23 17:56:44.408625 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:44.408559 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.236568971 podStartE2EDuration="6.408538691s" podCreationTimestamp="2026-04-23 17:56:38 +0000 UTC" firstStartedPulling="2026-04-23 17:56:38.805476755 +0000 UTC m=+243.620842385" lastFinishedPulling="2026-04-23 17:56:42.977446465 +0000 UTC m=+247.792812105" observedRunningTime="2026-04-23 17:56:44.406045412 +0000 UTC m=+249.221411082" watchObservedRunningTime="2026-04-23 17:56:44.408538691 +0000 UTC m=+249.223904338" Apr 23 17:56:44.958588 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:44.958556 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6867cfcb6c-chk7c"] Apr 23 17:56:48.634429 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:56:48.634388 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:09.978041 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:09.977977 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" 
podUID="af35427b-d504-4ded-b482-891528bafad3" containerName="registry" containerID="cri-o://f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094" gracePeriod=30 Apr 23 17:57:10.216484 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.216463 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" Apr 23 17:57:10.275818 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.275783 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-image-registry-private-configuration\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.275995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.275884 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-bound-sa-token\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.275995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.275925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rthwk\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-kube-api-access-rthwk\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.275995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.275950 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-installation-pull-secrets\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.275995 
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.275980 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.276198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.276031 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af35427b-d504-4ded-b482-891528bafad3-ca-trust-extracted\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.276198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.276088 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-registry-certificates\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.276198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.276117 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-trusted-ca\") pod \"af35427b-d504-4ded-b482-891528bafad3\" (UID: \"af35427b-d504-4ded-b482-891528bafad3\") " Apr 23 17:57:10.276944 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.276898 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:10.277083 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.276965 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:10.278558 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.278521 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:10.278558 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.278527 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-kube-api-access-rthwk" (OuterVolumeSpecName: "kube-api-access-rthwk") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "kube-api-access-rthwk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:10.278699 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.278633 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:10.278862 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.278807 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:10.279031 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.279001 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:10.285021 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.284994 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af35427b-d504-4ded-b482-891528bafad3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "af35427b-d504-4ded-b482-891528bafad3" (UID: "af35427b-d504-4ded-b482-891528bafad3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:57:10.377226 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377200 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-bound-sa-token\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:10.377226 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377225 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rthwk\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-kube-api-access-rthwk\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:10.377376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377236 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-installation-pull-secrets\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:10.377376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377247 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af35427b-d504-4ded-b482-891528bafad3-registry-tls\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:10.377376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377255 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af35427b-d504-4ded-b482-891528bafad3-ca-trust-extracted\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:10.377376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377264 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-registry-certificates\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 
Apr 23 17:57:10.377376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377273 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af35427b-d504-4ded-b482-891528bafad3-trusted-ca\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:57:10.377376 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.377282 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/af35427b-d504-4ded-b482-891528bafad3-image-registry-private-configuration\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 17:57:10.456242 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.456209 2579 generic.go:358] "Generic (PLEG): container finished" podID="af35427b-d504-4ded-b482-891528bafad3" containerID="f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094" exitCode=0
Apr 23 17:57:10.456378 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.456274 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c"
Apr 23 17:57:10.456378 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.456273 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" event={"ID":"af35427b-d504-4ded-b482-891528bafad3","Type":"ContainerDied","Data":"f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094"}
Apr 23 17:57:10.456378 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.456374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6867cfcb6c-chk7c" event={"ID":"af35427b-d504-4ded-b482-891528bafad3","Type":"ContainerDied","Data":"cd69ba809c9bec650b15f96e7ec1cf7a665031c8338160915412d28ecced8e28"}
Apr 23 17:57:10.456506 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.456389 2579 scope.go:117] "RemoveContainer" containerID="f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094"
Apr 23 17:57:10.465414 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.465396 2579 scope.go:117] "RemoveContainer" containerID="f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094"
Apr 23 17:57:10.465673 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:10.465647 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094\": container with ID starting with f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094 not found: ID does not exist" containerID="f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094"
Apr 23 17:57:10.465724 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.465677 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094"} err="failed to get container status \"f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094\": rpc error: code = NotFound desc = could not find container \"f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094\": container with ID starting with f8a1ce8346d49bdbcdc0eb55393005bd6879d6fd052070bc888780937f55e094 not found: ID does not exist"
Apr 23 17:57:10.477677 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.477656 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6867cfcb6c-chk7c"]
Apr 23 17:57:10.483244 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:10.483223 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6867cfcb6c-chk7c"]
Apr 23 17:57:11.686685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:11.686638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:57:11.689166 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:11.689147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67c4e3ae-cc88-433d-8549-c77153e2e1d6-metrics-certs\") pod \"network-metrics-daemon-gnxs9\" (UID: \"67c4e3ae-cc88-433d-8549-c77153e2e1d6\") " pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:57:11.715995 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:11.715971 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-s85dw\""
Apr 23 17:57:11.723170 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:11.723141 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gnxs9"
Apr 23 17:57:11.804638 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:11.804602 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af35427b-d504-4ded-b482-891528bafad3" path="/var/lib/kubelet/pods/af35427b-d504-4ded-b482-891528bafad3/volumes"
Apr 23 17:57:11.846007 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:11.845969 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gnxs9"]
Apr 23 17:57:11.848706 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:57:11.848673 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c4e3ae_cc88_433d_8549_c77153e2e1d6.slice/crio-25df55b88d145ae56b20d1543a436b2bb5f281642ab37c3fffc404039a939001 WatchSource:0}: Error finding container 25df55b88d145ae56b20d1543a436b2bb5f281642ab37c3fffc404039a939001: Status 404 returned error can't find the container with id 25df55b88d145ae56b20d1543a436b2bb5f281642ab37c3fffc404039a939001
Apr 23 17:57:12.464385 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:12.464349 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gnxs9" event={"ID":"67c4e3ae-cc88-433d-8549-c77153e2e1d6","Type":"ContainerStarted","Data":"25df55b88d145ae56b20d1543a436b2bb5f281642ab37c3fffc404039a939001"}
Apr 23 17:57:13.468566 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:13.468478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gnxs9" event={"ID":"67c4e3ae-cc88-433d-8549-c77153e2e1d6","Type":"ContainerStarted","Data":"94d4495ce61a98ed85c575f20793411469ca9d9aa78d3c81d12a54a9be56838a"}
Apr 23 17:57:13.468566 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:13.468514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gnxs9" event={"ID":"67c4e3ae-cc88-433d-8549-c77153e2e1d6","Type":"ContainerStarted","Data":"cb1dbd7c07b410c7037ad627c2f9ad4bb88ea72a4685d660533c31fddb9fc89c"}
Apr 23 17:57:13.486198 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:13.486128 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gnxs9" podStartSLOduration=129.464938582 podStartE2EDuration="2m10.486110465s" podCreationTimestamp="2026-04-23 17:55:03 +0000 UTC" firstStartedPulling="2026-04-23 17:57:11.850464875 +0000 UTC m=+276.665830501" lastFinishedPulling="2026-04-23 17:57:12.871636756 +0000 UTC m=+277.687002384" observedRunningTime="2026-04-23 17:57:13.484675941 +0000 UTC m=+278.300041586" watchObservedRunningTime="2026-04-23 17:57:13.486110465 +0000 UTC m=+278.301476113"
Apr 23 17:57:35.696443 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:35.696424 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 17:57:38.635131 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:38.635076 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:57:38.654346 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:38.654315 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:57:39.559332 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:39.559300 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:57:54.273615 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.273575 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:57:54.274116 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.274095 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af35427b-d504-4ded-b482-891528bafad3" containerName="registry"
Apr 23 17:57:54.274192 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.274120 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af35427b-d504-4ded-b482-891528bafad3" containerName="registry"
Apr 23 17:57:54.274246 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.274222 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af35427b-d504-4ded-b482-891528bafad3" containerName="registry"
Apr 23 17:57:54.276867 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.276845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.279610 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.279585 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 17:57:54.279739 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.279620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 17:57:54.279739 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.279723 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 17:57:54.279876 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.279739 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 17:57:54.280031 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.279996 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dv2fr\""
Apr 23 17:57:54.280031 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.280001 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 17:57:54.280189 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.280037 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 17:57:54.280189 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.280007 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 17:57:54.280820 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.280801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 17:57:54.285086 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.285067 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 17:57:54.290907 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.290884 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:57:54.335992 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.335959 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.335992 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.335995 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5s6x\" (UniqueName: \"kubernetes.io/projected/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-kube-api-access-j5s6x\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336024 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-web-config\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336136 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336234 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336479 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336479 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336479 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-config-volume\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336479 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336479 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-config-out\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.336479 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.336410 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437407 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437407 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-config-volume\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-config-out\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.437657 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.438003 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.437977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.438074 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.438043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.438141 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.438081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5s6x\" (UniqueName: \"kubernetes.io/projected/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-kube-api-access-j5s6x\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.438141 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.438135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-web-config\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.438247 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.438161 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.438655 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.438604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.439766 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.439736 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441009 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.440979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-config-out\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441122 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.441012 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-config-volume\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441487 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.441458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441590 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.441503 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441682 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.441663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441740 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.441702 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.441893 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.441878 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-web-config\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.442114 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.442095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.442847 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.442796 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.447358 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.447337 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5s6x\" (UniqueName: \"kubernetes.io/projected/03db3ae9-3bb0-4846-be9b-3e502d6af2ea-kube-api-access-j5s6x\") pod \"alertmanager-main-0\" (UID: \"03db3ae9-3bb0-4846-be9b-3e502d6af2ea\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.586736 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.586659 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:57:54.716366 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.716342 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:57:54.719111 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:57:54.719078 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03db3ae9_3bb0_4846_be9b_3e502d6af2ea.slice/crio-07505de7af3f5bcf13d74a0b0f15b3ac79cde5557a5be62f73f6c484acd3e69b WatchSource:0}: Error finding container 07505de7af3f5bcf13d74a0b0f15b3ac79cde5557a5be62f73f6c484acd3e69b: Status 404 returned error can't find the container with id 07505de7af3f5bcf13d74a0b0f15b3ac79cde5557a5be62f73f6c484acd3e69b
Apr 23 17:57:54.721267 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:54.721246 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:57:55.585924 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:55.585878 2579 generic.go:358] "Generic (PLEG): container finished" podID="03db3ae9-3bb0-4846-be9b-3e502d6af2ea" containerID="cf038f2c77ba2d929b4e5069c0a1d69deab7686fc5f64b3844fcf918238beec2" exitCode=0
Apr 23 17:57:55.586298 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:55.585968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerDied","Data":"cf038f2c77ba2d929b4e5069c0a1d69deab7686fc5f64b3844fcf918238beec2"}
Apr 23 17:57:55.586298 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:55.586003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"07505de7af3f5bcf13d74a0b0f15b3ac79cde5557a5be62f73f6c484acd3e69b"}
Apr 23 17:57:56.718717 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.718458 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:57:56.719233 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.719072 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="prometheus" containerID="cri-o://b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b" gracePeriod=600
Apr 23 17:57:56.719233 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.719120 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-web" containerID="cri-o://02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7" gracePeriod=600
Apr 23 17:57:56.719355 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.719217 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="thanos-sidecar" containerID="cri-o://040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5" gracePeriod=600
Apr 23 17:57:56.719355 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.719292 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="config-reloader" containerID="cri-o://cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450" gracePeriod=600
Apr 23 17:57:56.719445 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.719427 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy" containerID="cri-o://b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8" gracePeriod=600
Apr 23 17:57:56.719556 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:56.719287 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-thanos" containerID="cri-o://7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120" gracePeriod=600
Apr 23 17:57:57.268987 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.268963 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:57:57.362297 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362273 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-kubelet-serving-ca-bundle\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362409 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362320 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-trusted-ca-bundle\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362409 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362349 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362409 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362387 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-config-out\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362420 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-serving-certs-ca-bundle\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362445 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-tls-assets\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362491 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-thanos-prometheus-http-client-file\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362519 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-config\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362552 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-web-config\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362579 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-grpc-tls\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362624 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362611 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-metrics-client-ca\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362640 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k99tt\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-kube-api-access-k99tt\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362671 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-kube-rbac-proxy\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362708 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-metrics-client-certs\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362741 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-rulefiles-0\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") "
Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362731 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362754 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "prometheus-trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362778 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362844 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-tls\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " Apr 23 17:57:57.362972 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.362870 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-db\") pod \"f1d524af-03d1-4085-a649-4d273222e4ed\" (UID: \"f1d524af-03d1-4085-a649-4d273222e4ed\") " Apr 23 17:57:57.363381 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.363098 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.363381 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.363118 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.363628 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:57:57.363602 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:57.364117 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.364094 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:57:57.365658 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.365385 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:57.365921 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.365896 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-config-out" (OuterVolumeSpecName: "config-out") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:57:57.366009 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.365963 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.366254 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.366227 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:57:57.367666 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.367630 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:57.367914 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.367884 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.368003 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.367936 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.368123 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.368064 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-config" (OuterVolumeSpecName: "config") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.368184 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.368114 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.368504 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.368476 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.368974 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.368942 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-kube-api-access-k99tt" (OuterVolumeSpecName: "kube-api-access-k99tt") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "kube-api-access-k99tt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:57:57.369890 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.369846 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.370715 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.370688 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.381184 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.381161 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-web-config" (OuterVolumeSpecName: "web-config") pod "f1d524af-03d1-4085-a649-4d273222e4ed" (UID: "f1d524af-03d1-4085-a649-4d273222e4ed"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:57:57.463488 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463467 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463496 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-tls-assets\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463513 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463525 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-config\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463534 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-web-config\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463543 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-grpc-tls\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:57:57.463553 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-configmap-metrics-client-ca\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463559 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463562 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k99tt\" (UniqueName: \"kubernetes.io/projected/f1d524af-03d1-4085-a649-4d273222e4ed-kube-api-access-k99tt\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463572 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-kube-rbac-proxy\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463585 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-metrics-client-certs\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463599 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463612 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 
17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463622 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463632 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-prometheus-k8s-db\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463641 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1d524af-03d1-4085-a649-4d273222e4ed-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.463802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.463649 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1d524af-03d1-4085-a649-4d273222e4ed-config-out\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 17:57:57.595251 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595221 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120" exitCode=0 Apr 23 17:57:57.595251 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595249 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8" exitCode=0 Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595256 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7" exitCode=0 Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595263 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5" exitCode=0 Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595267 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450" exitCode=0 Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595272 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d524af-03d1-4085-a649-4d273222e4ed" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b" exitCode=0 Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595323 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.595405 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595372 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120" Apr 23 17:57:57.595685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} Apr 23 17:57:57.595685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595486 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} Apr 23 17:57:57.595685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595512 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} Apr 23 17:57:57.595685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595527 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} Apr 23 17:57:57.595685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} Apr 23 17:57:57.595685 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.595555 
2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f1d524af-03d1-4085-a649-4d273222e4ed","Type":"ContainerDied","Data":"0736c0829bce78f7dc606b4fe4bba7703dc9959072351da76111f723a66367a7"} Apr 23 17:57:57.598875 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.598855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"1f4bcd8039d6ab3587d6a28c91504a904a0b4b93fa34f122975580a2264d186e"} Apr 23 17:57:57.599033 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.598880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"9bf61bbc32d071a98acb46e58864139f709fe434522fa9df2cd81b0856cd4aff"} Apr 23 17:57:57.599033 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.598896 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"8cb16912387b63c2d7eb621c6644cb0020fe6f22718bfe57f548888796c839ca"} Apr 23 17:57:57.599033 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.598907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"a6136324d079ebeb338d935cd1986aa8023cc729b75ad811ed2a8382a50b28ce"} Apr 23 17:57:57.599033 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.598915 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"088435df5b9609c2b6161871e5b123c0619a78caf39f146f8ff856c7fd886125"} Apr 23 17:57:57.607605 ip-10-0-130-202 kubenswrapper[2579]: I0423 
17:57:57.607563 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8" Apr 23 17:57:57.624937 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.624379 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:57:57.628249 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.627895 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:57:57.635358 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.635339 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7" Apr 23 17:57:57.642976 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.642957 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5" Apr 23 17:57:57.650185 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.650159 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450" Apr 23 17:57:57.654356 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654336 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:57:57.654763 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654733 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="prometheus" Apr 23 17:57:57.654763 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654756 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="prometheus" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654777 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="config-reloader" Apr 23 17:57:57.654927 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:57:57.654785 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="config-reloader" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654809 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="thanos-sidecar" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654818 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="thanos-sidecar" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654857 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654864 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654877 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-thanos" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654885 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-thanos" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654900 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-web" Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654909 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-web" Apr 23 
17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654921 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="init-config-reloader"
Apr 23 17:57:57.654927 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.654929 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="init-config-reloader"
Apr 23 17:57:57.655704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.655038 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy"
Apr 23 17:57:57.655704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.655054 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="config-reloader"
Apr 23 17:57:57.655704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.655065 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-web"
Apr 23 17:57:57.655704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.655076 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="kube-rbac-proxy-thanos"
Apr 23 17:57:57.655704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.655086 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="prometheus"
Apr 23 17:57:57.655704 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.655094 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" containerName="thanos-sidecar"
Apr 23 17:57:57.658676 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.658659 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.659791 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.659776 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:57:57.662720 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.662673 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 17:57:57.662720 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.662715 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 17:57:57.662908 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.662733 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 17:57:57.663192 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663172 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6blulatql74b7\""
Apr 23 17:57:57.663286 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663229 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 17:57:57.663570 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663552 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 17:57:57.663655 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663576 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 17:57:57.663655 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663600 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 17:57:57.663760 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663556 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 17:57:57.663889 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663874 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rs2hc\""
Apr 23 17:57:57.664015 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.663996 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:57:57.664100 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.664083 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 17:57:57.664789 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.664771 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 17:57:57.666471 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.666442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 17:57:57.668240 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.668135 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.669596 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.669431 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 17:57:57.671576 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.671558 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:57:57.676844 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.676805 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.677170 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.677152 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.677242 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677182 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} err="failed to get container status \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist"
Apr 23 17:57:57.677242 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677206 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.677484 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.677466 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.677519 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677489 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} err="failed to get container status \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist"
Apr 23 17:57:57.677519 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677506 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.677706 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.677687 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.677807 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677712 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} err="failed to get container status \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist"
Apr 23 17:57:57.677807 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677730 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"
Apr 23 17:57:57.677967 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.677949 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"
Apr 23 17:57:57.678011 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677971 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} err="failed to get container status \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist"
Apr 23 17:57:57.678011 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.677987 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"
Apr 23 17:57:57.678206 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.678186 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"
Apr 23 17:57:57.678264 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.678216 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} err="failed to get container status \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": rpc error: code = NotFound desc = could not find container \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist"
Apr 23 17:57:57.678264 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.678235 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.678551 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.678525 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.678629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.678559 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} err="failed to get container status \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist"
Apr 23 17:57:57.678629 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.678582 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.678813 ip-10-0-130-202 kubenswrapper[2579]: E0423 17:57:57.678797 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.678879 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.678816 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} err="failed to get container status \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist"
Apr 23 17:57:57.678879 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.678860 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.679063 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679046 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} err="failed to get container status \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist"
Apr 23 17:57:57.679125 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679064 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.679296 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679278 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} err="failed to get container status \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist"
Apr 23 17:57:57.679339 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679298 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.679513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679493 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} err="failed to get container status \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist"
Apr 23 17:57:57.679513 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679513 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"
Apr 23 17:57:57.679695 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679680 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} err="failed to get container status \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist"
Apr 23 17:57:57.679745 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679697 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"
Apr 23 17:57:57.679893 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679875 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} err="failed to get container status \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": rpc error: code = NotFound desc = could not find container \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist"
Apr 23 17:57:57.679951 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.679894 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.680096 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680076 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} err="failed to get container status \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist"
Apr 23 17:57:57.680136 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680098 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.680289 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680274 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} err="failed to get container status \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist"
Apr 23 17:57:57.680331 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680290 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.680458 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680439 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} err="failed to get container status \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist"
Apr 23 17:57:57.680503 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680459 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.680622 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680607 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} err="failed to get container status \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist"
Apr 23 17:57:57.680662 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680622 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.680805 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680791 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} err="failed to get container status \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist"
Apr 23 17:57:57.680876 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.680806 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"
Apr 23 17:57:57.681072 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681053 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} err="failed to get container status \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist"
Apr 23 17:57:57.681114 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681073 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"
Apr 23 17:57:57.681281 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681267 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} err="failed to get container status \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": rpc error: code = NotFound desc = could not find container \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist"
Apr 23 17:57:57.681319 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681281 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.681490 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681473 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} err="failed to get container status \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist"
Apr 23 17:57:57.681532 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681492 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.681687 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681670 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} err="failed to get container status \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist"
Apr 23 17:57:57.681733 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681687 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.681910 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681895 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} err="failed to get container status \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist"
Apr 23 17:57:57.681963 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.681911 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.682101 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682080 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} err="failed to get container status \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist"
Apr 23 17:57:57.682168 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682103 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.682296 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682279 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} err="failed to get container status \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist"
Apr 23 17:57:57.682338 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682296 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"
Apr 23 17:57:57.682590 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682559 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} err="failed to get container status \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist"
Apr 23 17:57:57.682672 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682593 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"
Apr 23 17:57:57.683050 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682969 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} err="failed to get container status \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": rpc error: code = NotFound desc = could not find container \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist"
Apr 23 17:57:57.683050 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.682995 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.684288 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.684261 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} err="failed to get container status \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist"
Apr 23 17:57:57.684397 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.684289 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.684578 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.684554 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} err="failed to get container status \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist"
Apr 23 17:57:57.684646 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.684580 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.684873 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.684845 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} err="failed to get container status \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist"
Apr 23 17:57:57.684952 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.684875 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.685166 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.685149 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} err="failed to get container status \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist"
Apr 23 17:57:57.685237 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.685167 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.685774 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.685730 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} err="failed to get container status \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist"
Apr 23 17:57:57.685774 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.685764 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"
Apr 23 17:57:57.686093 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686075 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} err="failed to get container status \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist"
Apr 23 17:57:57.686178 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686094 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"
Apr 23 17:57:57.686340 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686321 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} err="failed to get container status \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": rpc error: code = NotFound desc = could not find container \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist"
Apr 23 17:57:57.686394 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686341 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"
Apr 23 17:57:57.686592 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686568 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} err="failed to get container status \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist"
Apr 23 17:57:57.686633 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686595 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"
Apr 23 17:57:57.686938 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686843 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} err="failed to get container status \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist"
Apr 23 17:57:57.686938 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.686867 2579 scope.go:117] "RemoveContainer" containerID="7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"
Apr 23 17:57:57.687189 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687156 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120"} err="failed to get container status \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": rpc error: code = NotFound desc = could not find container \"7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120\": container with ID starting with 7a22148263dc9423d5d83c7d18acaaf9c3c34d32532faee72409cf9ef6be9120 not found: ID does not exist"
Apr 23 17:57:57.687189 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687189 2579 scope.go:117] "RemoveContainer" containerID="b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"
Apr 23 17:57:57.687422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687399 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8"} err="failed to get container status \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": rpc error: code = NotFound desc = could not find container \"b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8\": container with ID starting with b0e83e04e307e38b28465e8b3015adeaaef098e760edce49a591e7dc1f32eeb8 not found: ID does not exist"
Apr 23 17:57:57.687422 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687422 2579 scope.go:117] "RemoveContainer" containerID="02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"
Apr 23 17:57:57.687702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687666 2579
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7"} err="failed to get container status \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": rpc error: code = NotFound desc = could not find container \"02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7\": container with ID starting with 02da76cb58839e2304258ab58fbb1136cf8f904aca165070742b38a44d515ec7 not found: ID does not exist" Apr 23 17:57:57.687702 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687687 2579 scope.go:117] "RemoveContainer" containerID="040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5" Apr 23 17:57:57.687991 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687965 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5"} err="failed to get container status \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": rpc error: code = NotFound desc = could not find container \"040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5\": container with ID starting with 040820c58f3547ef694c2defdab71865b31e6095f437962bb02e4c356be7f4e5 not found: ID does not exist" Apr 23 17:57:57.688128 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.687991 2579 scope.go:117] "RemoveContainer" containerID="cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450" Apr 23 17:57:57.688304 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.688261 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450"} err="failed to get container status \"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": rpc error: code = NotFound desc = could not find container 
\"cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450\": container with ID starting with cb58d1f7701e638dabfc68c8316139d135e4acae4932af089071ba5a97815450 not found: ID does not exist" Apr 23 17:57:57.688304 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.688288 2579 scope.go:117] "RemoveContainer" containerID="b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b" Apr 23 17:57:57.688538 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.688520 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b"} err="failed to get container status \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": rpc error: code = NotFound desc = could not find container \"b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b\": container with ID starting with b6cbf8f2f2b9fc0f3d8cc0896c46a0983e537efd8379e1a5932bd8363bb5264b not found: ID does not exist" Apr 23 17:57:57.688601 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.688541 2579 scope.go:117] "RemoveContainer" containerID="2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8" Apr 23 17:57:57.688801 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.688776 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8"} err="failed to get container status \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": rpc error: code = NotFound desc = could not find container \"2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8\": container with ID starting with 2d7a70d162533efb531b5b19b334a3229c0252b0f4505dbea813c2aeecd03dc8 not found: ID does not exist" Apr 23 17:57:57.765431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765367 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765431 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765477 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgbj\" (UniqueName: \"kubernetes.io/projected/2079a3b2-c08a-44a6-a451-96a54393ff3b-kube-api-access-9qgbj\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2079a3b2-c08a-44a6-a451-96a54393ff3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765700 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-config\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765733 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765800 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765854 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 
kubenswrapper[2579]: I0423 17:57:57.765880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.765909 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2079a3b2-c08a-44a6-a451-96a54393ff3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.766279 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.766279 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.765965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.806732 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.804016 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d524af-03d1-4085-a649-4d273222e4ed" path="/var/lib/kubelet/pods/f1d524af-03d1-4085-a649-4d273222e4ed/volumes" Apr 23 17:57:57.866466 ip-10-0-130-202 kubenswrapper[2579]: I0423 
17:57:57.866422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2079a3b2-c08a-44a6-a451-96a54393ff3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866639 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866652 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgbj\" (UniqueName: \"kubernetes.io/projected/2079a3b2-c08a-44a6-a451-96a54393ff3b-kube-api-access-9qgbj\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2079a3b2-c08a-44a6-a451-96a54393ff3b-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.866877 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.867380 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-config\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.867380 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.866984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.867380 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.867023 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.867380 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.867056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.867380 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.867106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.867380 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.867129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.868146 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.867475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.868305 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.868278 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.869244 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.869115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.870085 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.869995 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.870636 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.870611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.870930 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.870618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2079a3b2-c08a-44a6-a451-96a54393ff3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871188 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871107 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871188 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2079a3b2-c08a-44a6-a451-96a54393ff3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871343 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871504 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871695 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871671 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871858 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.871940 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.871903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.872801 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.872780 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.873034 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.873013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-config\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.873731 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.873710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2079a3b2-c08a-44a6-a451-96a54393ff3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.874294 ip-10-0-130-202 kubenswrapper[2579]: 
I0423 17:57:57.874275 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2079a3b2-c08a-44a6-a451-96a54393ff3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.876337 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.876315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgbj\" (UniqueName: \"kubernetes.io/projected/2079a3b2-c08a-44a6-a451-96a54393ff3b-kube-api-access-9qgbj\") pod \"prometheus-k8s-0\" (UID: \"2079a3b2-c08a-44a6-a451-96a54393ff3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:57.972472 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:57.972435 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:58.102006 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:58.101983 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:57:58.104234 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:57:58.104204 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2079a3b2_c08a_44a6_a451_96a54393ff3b.slice/crio-cca6b49f868160f0ffab3a04aadadd6747112b07b3d26b40a2e0891c5a5379ac WatchSource:0}: Error finding container cca6b49f868160f0ffab3a04aadadd6747112b07b3d26b40a2e0891c5a5379ac: Status 404 returned error can't find the container with id cca6b49f868160f0ffab3a04aadadd6747112b07b3d26b40a2e0891c5a5379ac Apr 23 17:57:58.605031 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:58.604999 2579 generic.go:358] "Generic (PLEG): container finished" podID="2079a3b2-c08a-44a6-a451-96a54393ff3b" containerID="b85835a2cec18b3baccd5efe5ee74f95ddc92d15788dca0ab9266cedbd71919e" exitCode=0 Apr 23 17:57:58.605193 
ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:58.605085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerDied","Data":"b85835a2cec18b3baccd5efe5ee74f95ddc92d15788dca0ab9266cedbd71919e"}
Apr 23 17:57:58.605193 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:58.605116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"cca6b49f868160f0ffab3a04aadadd6747112b07b3d26b40a2e0891c5a5379ac"}
Apr 23 17:57:58.607940 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:58.607916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03db3ae9-3bb0-4846-be9b-3e502d6af2ea","Type":"ContainerStarted","Data":"2608e8b0bd4a45636e7bd9833d3750cc884d2e81e81686db964935b471a5a3e1"}
Apr 23 17:57:58.666920 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:58.666867 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.973763188 podStartE2EDuration="4.666846516s" podCreationTimestamp="2026-04-23 17:57:54 +0000 UTC" firstStartedPulling="2026-04-23 17:57:55.586991385 +0000 UTC m=+320.402357011" lastFinishedPulling="2026-04-23 17:57:57.280074708 +0000 UTC m=+322.095440339" observedRunningTime="2026-04-23 17:57:58.665603459 +0000 UTC m=+323.480969131" watchObservedRunningTime="2026-04-23 17:57:58.666846516 +0000 UTC m=+323.482212158"
Apr 23 17:57:59.613540 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.613503 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"7a7eefb875e9de1af99994f877d45063de30e8d9b538c4eba4303bda49d402d3"}
Apr 23 17:57:59.613540 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.613541 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"0f714e39567d41025749994a9f148ba425e169104674980a945e9f8d799d0d0e"}
Apr 23 17:57:59.613973 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.613553 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"78291414bde791d99f418c7b8149b7491da496fded919146e599aa73895600b7"}
Apr 23 17:57:59.613973 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.613561 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"010df9582a5a17503b16af3d40795e934bfb754b693e68dc6154e1c86149377f"}
Apr 23 17:57:59.613973 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.613572 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"f68f45e9cd5ed86988448502e159c1d8bb4e173a3b26540944abd227b5cd2dea"}
Apr 23 17:57:59.613973 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.613580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2079a3b2-c08a-44a6-a451-96a54393ff3b","Type":"ContainerStarted","Data":"89849ed723d31d6cffd7bffbd46dbd70ac88895607e154d5e3c2a3b05cbcd484"}
Apr 23 17:57:59.649713 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:57:59.649607 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.649587826 podStartE2EDuration="2.649587826s" podCreationTimestamp="2026-04-23 17:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:57:59.646919626 +0000 UTC m=+324.462285275" watchObservedRunningTime="2026-04-23 17:57:59.649587826 +0000 UTC m=+324.464953476"
Apr 23 17:58:02.973356 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:02.973304 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:58:25.503912 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.503879 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nfps4"]
Apr 23 17:58:25.508263 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.508243 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.511622 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.511599 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 17:58:25.514801 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.514768 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nfps4"]
Apr 23 17:58:25.598617 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.598583 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ec5a0b72-a897-43b8-a606-7100e1d87870-original-pull-secret\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.598795 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.598631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ec5a0b72-a897-43b8-a606-7100e1d87870-kubelet-config\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.598795 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.598650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ec5a0b72-a897-43b8-a606-7100e1d87870-dbus\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.700010 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.699981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ec5a0b72-a897-43b8-a606-7100e1d87870-original-pull-secret\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.700217 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.700029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ec5a0b72-a897-43b8-a606-7100e1d87870-kubelet-config\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.700217 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.700045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ec5a0b72-a897-43b8-a606-7100e1d87870-dbus\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.700217 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.700129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ec5a0b72-a897-43b8-a606-7100e1d87870-kubelet-config\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.700217 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.700168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ec5a0b72-a897-43b8-a606-7100e1d87870-dbus\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.702591 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.702560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ec5a0b72-a897-43b8-a606-7100e1d87870-original-pull-secret\") pod \"global-pull-secret-syncer-nfps4\" (UID: \"ec5a0b72-a897-43b8-a606-7100e1d87870\") " pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.817721 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.817649 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nfps4"
Apr 23 17:58:25.940032 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:25.940008 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nfps4"]
Apr 23 17:58:25.943166 ip-10-0-130-202 kubenswrapper[2579]: W0423 17:58:25.943139 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5a0b72_a897_43b8_a606_7100e1d87870.slice/crio-cb894c43d8585a0a9ca3d7e6a4a2f12ec17472f8c5db878a77272685d615002f WatchSource:0}: Error finding container cb894c43d8585a0a9ca3d7e6a4a2f12ec17472f8c5db878a77272685d615002f: Status 404 returned error can't find the container with id cb894c43d8585a0a9ca3d7e6a4a2f12ec17472f8c5db878a77272685d615002f
Apr 23 17:58:26.695265 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:26.695227 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nfps4" event={"ID":"ec5a0b72-a897-43b8-a606-7100e1d87870","Type":"ContainerStarted","Data":"cb894c43d8585a0a9ca3d7e6a4a2f12ec17472f8c5db878a77272685d615002f"}
Apr 23 17:58:29.705782 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:29.705744 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nfps4" event={"ID":"ec5a0b72-a897-43b8-a606-7100e1d87870","Type":"ContainerStarted","Data":"a51171c26cb37e5840fd6e34412a377ca971bb832a729625661ff79756346ce6"}
Apr 23 17:58:29.723443 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:29.723390 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nfps4" podStartSLOduration=1.165677838 podStartE2EDuration="4.723375026s" podCreationTimestamp="2026-04-23 17:58:25 +0000 UTC" firstStartedPulling="2026-04-23 17:58:25.945038657 +0000 UTC m=+350.760404284" lastFinishedPulling="2026-04-23 17:58:29.502735847 +0000 UTC m=+354.318101472" observedRunningTime="2026-04-23 17:58:29.721384483 +0000 UTC m=+354.536750131" watchObservedRunningTime="2026-04-23 17:58:29.723375026 +0000 UTC m=+354.538740675"
Apr 23 17:58:57.972802 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:57.972773 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:58:57.988840 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:57.988797 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:58:58.806894 ip-10-0-130-202 kubenswrapper[2579]: I0423 17:58:58.806865 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 18:00:21.503229 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.503199 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-bk6dd"]
Apr 23 18:00:21.506458 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.506441 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:21.509205 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.509185 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 23 18:00:21.509300 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.509203 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 18:00:21.510378 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.510352 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-dmjzp\""
Apr 23 18:00:21.510378 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.510366 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 18:00:21.515133 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.515113 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-bk6dd"]
Apr 23 18:00:21.639865 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.639816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:21.640016 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.639872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj2mj\" (UniqueName: \"kubernetes.io/projected/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-kube-api-access-wj2mj\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:21.741004 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.740971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:21.741140 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.741013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj2mj\" (UniqueName: \"kubernetes.io/projected/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-kube-api-access-wj2mj\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:21.741140 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:00:21.741114 2579 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 23 18:00:21.741212 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:00:21.741187 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert podName:cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7 nodeName:}" failed. No retries permitted until 2026-04-23 18:00:22.241169517 +0000 UTC m=+467.056535143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert") pod "kserve-controller-manager-6fc5d867c5-bk6dd" (UID: "cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7") : secret "kserve-webhook-server-cert" not found
Apr 23 18:00:21.750818 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:21.750790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj2mj\" (UniqueName: \"kubernetes.io/projected/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-kube-api-access-wj2mj\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:22.245808 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:22.245747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:22.248440 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:22.248406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert\") pod \"kserve-controller-manager-6fc5d867c5-bk6dd\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") " pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:22.417612 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:22.417573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:22.535960 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:22.535894 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-bk6dd"]
Apr 23 18:00:22.538608 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:00:22.538580 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb8691b0_42cc_437d_8fc7_5cff3ffcb8a7.slice/crio-813a537a207f4fa49be1784d2b97dbc382151b119c33e29fefdca9896a948c6b WatchSource:0}: Error finding container 813a537a207f4fa49be1784d2b97dbc382151b119c33e29fefdca9896a948c6b: Status 404 returned error can't find the container with id 813a537a207f4fa49be1784d2b97dbc382151b119c33e29fefdca9896a948c6b
Apr 23 18:00:23.042794 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:23.042760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd" event={"ID":"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7","Type":"ContainerStarted","Data":"813a537a207f4fa49be1784d2b97dbc382151b119c33e29fefdca9896a948c6b"}
Apr 23 18:00:26.053569 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:26.053533 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd" event={"ID":"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7","Type":"ContainerStarted","Data":"fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4"}
Apr 23 18:00:26.053990 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:26.053596 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:00:26.072260 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:26.072219 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd" podStartSLOduration=2.592498586 podStartE2EDuration="5.072205041s" podCreationTimestamp="2026-04-23 18:00:21 +0000 UTC" firstStartedPulling="2026-04-23 18:00:22.539930947 +0000 UTC m=+467.355296573" lastFinishedPulling="2026-04-23 18:00:25.019637403 +0000 UTC m=+469.835003028" observedRunningTime="2026-04-23 18:00:26.071097069 +0000 UTC m=+470.886462718" watchObservedRunningTime="2026-04-23 18:00:26.072205041 +0000 UTC m=+470.887570689"
Apr 23 18:00:57.062427 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:00:57.062398 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:01:14.351496 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.351411 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-bk6dd"]
Apr 23 18:01:14.351892 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.351683 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd" podUID="cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" containerName="manager" containerID="cri-o://fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4" gracePeriod=10
Apr 23 18:01:14.377276 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.377243 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-92hgn"]
Apr 23 18:01:14.379411 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.379394 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.389699 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.389673 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-92hgn"]
Apr 23 18:01:14.496587 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.496545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75e7b965-7ae8-4e5c-8ef3-e321978caeba-cert\") pod \"kserve-controller-manager-6fc5d867c5-92hgn\" (UID: \"75e7b965-7ae8-4e5c-8ef3-e321978caeba\") " pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.496587 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.496596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvjv\" (UniqueName: \"kubernetes.io/projected/75e7b965-7ae8-4e5c-8ef3-e321978caeba-kube-api-access-qrvjv\") pod \"kserve-controller-manager-6fc5d867c5-92hgn\" (UID: \"75e7b965-7ae8-4e5c-8ef3-e321978caeba\") " pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.583688 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.583667 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:01:14.597132 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.597108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75e7b965-7ae8-4e5c-8ef3-e321978caeba-cert\") pod \"kserve-controller-manager-6fc5d867c5-92hgn\" (UID: \"75e7b965-7ae8-4e5c-8ef3-e321978caeba\") " pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.597238 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.597147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvjv\" (UniqueName: \"kubernetes.io/projected/75e7b965-7ae8-4e5c-8ef3-e321978caeba-kube-api-access-qrvjv\") pod \"kserve-controller-manager-6fc5d867c5-92hgn\" (UID: \"75e7b965-7ae8-4e5c-8ef3-e321978caeba\") " pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.599714 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.599692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75e7b965-7ae8-4e5c-8ef3-e321978caeba-cert\") pod \"kserve-controller-manager-6fc5d867c5-92hgn\" (UID: \"75e7b965-7ae8-4e5c-8ef3-e321978caeba\") " pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.606709 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.606646 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrvjv\" (UniqueName: \"kubernetes.io/projected/75e7b965-7ae8-4e5c-8ef3-e321978caeba-kube-api-access-qrvjv\") pod \"kserve-controller-manager-6fc5d867c5-92hgn\" (UID: \"75e7b965-7ae8-4e5c-8ef3-e321978caeba\") " pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.697912 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.697870 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert\") pod \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") "
Apr 23 18:01:14.697912 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.697916 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj2mj\" (UniqueName: \"kubernetes.io/projected/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-kube-api-access-wj2mj\") pod \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\" (UID: \"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7\") "
Apr 23 18:01:14.700181 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.700155 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert" (OuterVolumeSpecName: "cert") pod "cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" (UID: "cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 18:01:14.700285 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.700208 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-kube-api-access-wj2mj" (OuterVolumeSpecName: "kube-api-access-wj2mj") pod "cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" (UID: "cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7"). InnerVolumeSpecName "kube-api-access-wj2mj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 18:01:14.746370 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.746323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:14.798628 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.798585 2579 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-cert\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 18:01:14.798628 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.798608 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj2mj\" (UniqueName: \"kubernetes.io/projected/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7-kube-api-access-wj2mj\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\""
Apr 23 18:01:14.864324 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:14.864297 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-92hgn"]
Apr 23 18:01:14.866684 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:01:14.866657 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e7b965_7ae8_4e5c_8ef3_e321978caeba.slice/crio-384eef995226231d68adc79830ec367e38ddd036e598a0e09c5b3b0c21714f62 WatchSource:0}: Error finding container 384eef995226231d68adc79830ec367e38ddd036e598a0e09c5b3b0c21714f62: Status 404 returned error can't find the container with id 384eef995226231d68adc79830ec367e38ddd036e598a0e09c5b3b0c21714f62
Apr 23 18:01:15.191599 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.191516 2579 generic.go:358] "Generic (PLEG): container finished" podID="cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" containerID="fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4" exitCode=0
Apr 23 18:01:15.191599 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.191574 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd"
Apr 23 18:01:15.191599 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.191588 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd" event={"ID":"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7","Type":"ContainerDied","Data":"fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4"}
Apr 23 18:01:15.191870 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.191628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-bk6dd" event={"ID":"cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7","Type":"ContainerDied","Data":"813a537a207f4fa49be1784d2b97dbc382151b119c33e29fefdca9896a948c6b"}
Apr 23 18:01:15.191870 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.191650 2579 scope.go:117] "RemoveContainer" containerID="fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4"
Apr 23 18:01:15.192724 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.192704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn" event={"ID":"75e7b965-7ae8-4e5c-8ef3-e321978caeba","Type":"ContainerStarted","Data":"384eef995226231d68adc79830ec367e38ddd036e598a0e09c5b3b0c21714f62"}
Apr 23 18:01:15.199573 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.199557 2579 scope.go:117] "RemoveContainer" containerID="fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4"
Apr 23 18:01:15.199817 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:01:15.199801 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4\": container with ID starting with fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4 not found: ID does not exist" containerID="fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4"
Apr 23 18:01:15.199947 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.199849 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4"} err="failed to get container status \"fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4\": rpc error: code = NotFound desc = could not find container \"fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4\": container with ID starting with fffe5023e751f36dcd0e9a54a8f718d57700f4656e022dd7917c6c7a2ddea4f4 not found: ID does not exist"
Apr 23 18:01:15.214005 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.213983 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-bk6dd"]
Apr 23 18:01:15.217624 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.217601 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6fc5d867c5-bk6dd"]
Apr 23 18:01:15.808263 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:15.804968 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" path="/var/lib/kubelet/pods/cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7/volumes"
Apr 23 18:01:16.197503 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:16.197417 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn" event={"ID":"75e7b965-7ae8-4e5c-8ef3-e321978caeba","Type":"ContainerStarted","Data":"a31a13d4ddda6c7c7e0aa933af325539cb8dddbd31fbad47255e45bc1e149b8a"}
Apr 23 18:01:16.197666 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:16.197552 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:16.216460 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:16.216412 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn" podStartSLOduration=1.781275355 podStartE2EDuration="2.216396992s" podCreationTimestamp="2026-04-23 18:01:14 +0000 UTC" firstStartedPulling="2026-04-23 18:01:14.86788871 +0000 UTC m=+519.683254337" lastFinishedPulling="2026-04-23 18:01:15.303010345 +0000 UTC m=+520.118375974" observedRunningTime="2026-04-23 18:01:16.214543266 +0000 UTC m=+521.029908924" watchObservedRunningTime="2026-04-23 18:01:16.216396992 +0000 UTC m=+521.031762640"
Apr 23 18:01:47.206970 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:47.206936 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6fc5d867c5-92hgn"
Apr 23 18:01:48.137383 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.137347 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-m2pk5"]
Apr 23 18:01:48.137774 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.137758 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" containerName="manager"
Apr 23 18:01:48.137818 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.137778 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" containerName="manager"
Apr 23 18:01:48.137883 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.137842 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb8691b0-42cc-437d-8fc7-5cff3ffcb8a7" containerName="manager"
Apr 23 18:01:48.139523 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.139506 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.142213 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.142194 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 23 18:01:48.142535 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.142520 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-rrk6d\""
Apr 23 18:01:48.149957 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.149934 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-m2pk5"]
Apr 23 18:01:48.176131 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.176103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qqn\" (UniqueName: \"kubernetes.io/projected/0e1cd28c-87d2-4ce6-9c87-bf429af34772-kube-api-access-67qqn\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.176262 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.176162 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1cd28c-87d2-4ce6-9c87-bf429af34772-tls-certs\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.276710 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.276668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67qqn\" (UniqueName: \"kubernetes.io/projected/0e1cd28c-87d2-4ce6-9c87-bf429af34772-kube-api-access-67qqn\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.277205 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.276753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1cd28c-87d2-4ce6-9c87-bf429af34772-tls-certs\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.277205 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:01:48.276919 2579 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 23 18:01:48.277205 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:01:48.277000 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e1cd28c-87d2-4ce6-9c87-bf429af34772-tls-certs podName:0e1cd28c-87d2-4ce6-9c87-bf429af34772 nodeName:}" failed. No retries permitted until 2026-04-23 18:01:48.776979148 +0000 UTC m=+553.592344791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0e1cd28c-87d2-4ce6-9c87-bf429af34772-tls-certs") pod "model-serving-api-86f7b4b499-m2pk5" (UID: "0e1cd28c-87d2-4ce6-9c87-bf429af34772") : secret "model-serving-api-tls" not found
Apr 23 18:01:48.288164 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.288138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qqn\" (UniqueName: \"kubernetes.io/projected/0e1cd28c-87d2-4ce6-9c87-bf429af34772-kube-api-access-67qqn\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.781523 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.781472 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1cd28c-87d2-4ce6-9c87-bf429af34772-tls-certs\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:48.784576 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:48.784551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1cd28c-87d2-4ce6-9c87-bf429af34772-tls-certs\") pod \"model-serving-api-86f7b4b499-m2pk5\" (UID: \"0e1cd28c-87d2-4ce6-9c87-bf429af34772\") " pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:49.051337 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:49.051253 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:49.174519 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:49.174361 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-m2pk5"]
Apr 23 18:01:49.177136 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:01:49.177107 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1cd28c_87d2_4ce6_9c87_bf429af34772.slice/crio-6df7ed4f531b127d63e9b707cfcd0ea8dce76aa3c3580e46f5f6abcc423d5876 WatchSource:0}: Error finding container 6df7ed4f531b127d63e9b707cfcd0ea8dce76aa3c3580e46f5f6abcc423d5876: Status 404 returned error can't find the container with id 6df7ed4f531b127d63e9b707cfcd0ea8dce76aa3c3580e46f5f6abcc423d5876
Apr 23 18:01:49.299885 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:49.299849 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-m2pk5" event={"ID":"0e1cd28c-87d2-4ce6-9c87-bf429af34772","Type":"ContainerStarted","Data":"6df7ed4f531b127d63e9b707cfcd0ea8dce76aa3c3580e46f5f6abcc423d5876"}
Apr 23 18:01:51.306958 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:51.306924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-m2pk5" event={"ID":"0e1cd28c-87d2-4ce6-9c87-bf429af34772","Type":"ContainerStarted","Data":"d69182be0fd60a210b7be09a1272aac180b7b7b4ab80c9bf269f0f902853898e"}
Apr 23 18:01:51.307341 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:51.307036 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:01:51.326168 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:01:51.326118 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-m2pk5" podStartSLOduration=2.177915712 podStartE2EDuration="3.326105151s" podCreationTimestamp="2026-04-23 18:01:48 +0000 UTC" firstStartedPulling="2026-04-23 18:01:49.179031757 +0000 UTC m=+553.994397383" lastFinishedPulling="2026-04-23 18:01:50.327221179 +0000 UTC m=+555.142586822" observedRunningTime="2026-04-23 18:01:51.324691346 +0000 UTC m=+556.140056995" watchObservedRunningTime="2026-04-23 18:01:51.326105151 +0000 UTC m=+556.141470800"
Apr 23 18:02:02.314941 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:02.314912 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-m2pk5"
Apr 23 18:02:04.128810 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.128779 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-d8s5w"]
Apr 23 18:02:04.131530 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.131513 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-d8s5w" Apr 23 18:02:04.134283 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.134261 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-td2rr\"" Apr 23 18:02:04.134397 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.134341 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 18:02:04.137636 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.137615 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-d8s5w"] Apr 23 18:02:04.210093 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.210057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh79c\" (UniqueName: \"kubernetes.io/projected/623c2c74-3b8f-477a-9d89-a686c3f7d236-kube-api-access-wh79c\") pod \"s3-init-d8s5w\" (UID: \"623c2c74-3b8f-477a-9d89-a686c3f7d236\") " pod="kserve/s3-init-d8s5w" Apr 23 18:02:04.311375 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.311338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh79c\" (UniqueName: \"kubernetes.io/projected/623c2c74-3b8f-477a-9d89-a686c3f7d236-kube-api-access-wh79c\") pod \"s3-init-d8s5w\" (UID: \"623c2c74-3b8f-477a-9d89-a686c3f7d236\") " pod="kserve/s3-init-d8s5w" Apr 23 18:02:04.320496 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.320464 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh79c\" (UniqueName: \"kubernetes.io/projected/623c2c74-3b8f-477a-9d89-a686c3f7d236-kube-api-access-wh79c\") pod \"s3-init-d8s5w\" (UID: \"623c2c74-3b8f-477a-9d89-a686c3f7d236\") " pod="kserve/s3-init-d8s5w" Apr 23 18:02:04.457004 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.456918 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-d8s5w" Apr 23 18:02:04.576564 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:04.576539 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-d8s5w"] Apr 23 18:02:04.579237 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:02:04.579201 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod623c2c74_3b8f_477a_9d89_a686c3f7d236.slice/crio-37aa5ee38a87b68ab05347af05163916fc17f8aac02f76c6c341e92a7bdbf9fd WatchSource:0}: Error finding container 37aa5ee38a87b68ab05347af05163916fc17f8aac02f76c6c341e92a7bdbf9fd: Status 404 returned error can't find the container with id 37aa5ee38a87b68ab05347af05163916fc17f8aac02f76c6c341e92a7bdbf9fd Apr 23 18:02:05.352434 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:05.352381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-d8s5w" event={"ID":"623c2c74-3b8f-477a-9d89-a686c3f7d236","Type":"ContainerStarted","Data":"37aa5ee38a87b68ab05347af05163916fc17f8aac02f76c6c341e92a7bdbf9fd"} Apr 23 18:02:09.370103 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:09.370011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-d8s5w" event={"ID":"623c2c74-3b8f-477a-9d89-a686c3f7d236","Type":"ContainerStarted","Data":"07bb5179ea6d1e4ba7ec90ab86a9c4ae442c9d86d2e92e7c306ac29bf9fe206c"} Apr 23 18:02:09.387511 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:09.387462 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-d8s5w" podStartSLOduration=0.878500701 podStartE2EDuration="5.387449026s" podCreationTimestamp="2026-04-23 18:02:04 +0000 UTC" firstStartedPulling="2026-04-23 18:02:04.580849069 +0000 UTC m=+569.396214695" lastFinishedPulling="2026-04-23 18:02:09.089797378 +0000 UTC m=+573.905163020" observedRunningTime="2026-04-23 18:02:09.386252996 +0000 UTC m=+574.201618644" watchObservedRunningTime="2026-04-23 
18:02:09.387449026 +0000 UTC m=+574.202814672" Apr 23 18:02:12.382295 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:12.382210 2579 generic.go:358] "Generic (PLEG): container finished" podID="623c2c74-3b8f-477a-9d89-a686c3f7d236" containerID="07bb5179ea6d1e4ba7ec90ab86a9c4ae442c9d86d2e92e7c306ac29bf9fe206c" exitCode=0 Apr 23 18:02:12.382628 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:12.382286 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-d8s5w" event={"ID":"623c2c74-3b8f-477a-9d89-a686c3f7d236","Type":"ContainerDied","Data":"07bb5179ea6d1e4ba7ec90ab86a9c4ae442c9d86d2e92e7c306ac29bf9fe206c"} Apr 23 18:02:13.506969 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:13.506940 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-d8s5w" Apr 23 18:02:13.597010 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:13.596972 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh79c\" (UniqueName: \"kubernetes.io/projected/623c2c74-3b8f-477a-9d89-a686c3f7d236-kube-api-access-wh79c\") pod \"623c2c74-3b8f-477a-9d89-a686c3f7d236\" (UID: \"623c2c74-3b8f-477a-9d89-a686c3f7d236\") " Apr 23 18:02:13.599216 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:13.599185 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623c2c74-3b8f-477a-9d89-a686c3f7d236-kube-api-access-wh79c" (OuterVolumeSpecName: "kube-api-access-wh79c") pod "623c2c74-3b8f-477a-9d89-a686c3f7d236" (UID: "623c2c74-3b8f-477a-9d89-a686c3f7d236"). InnerVolumeSpecName "kube-api-access-wh79c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:02:13.698629 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:13.698538 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wh79c\" (UniqueName: \"kubernetes.io/projected/623c2c74-3b8f-477a-9d89-a686c3f7d236-kube-api-access-wh79c\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:02:14.389844 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:14.389791 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-d8s5w" Apr 23 18:02:14.389844 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:14.389790 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-d8s5w" event={"ID":"623c2c74-3b8f-477a-9d89-a686c3f7d236","Type":"ContainerDied","Data":"37aa5ee38a87b68ab05347af05163916fc17f8aac02f76c6c341e92a7bdbf9fd"} Apr 23 18:02:14.390057 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:02:14.389867 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37aa5ee38a87b68ab05347af05163916fc17f8aac02f76c6c341e92a7bdbf9fd" Apr 23 18:05:28.639675 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.639636 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh"] Apr 23 18:05:28.640138 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.639992 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="623c2c74-3b8f-477a-9d89-a686c3f7d236" containerName="s3-init" Apr 23 18:05:28.640138 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.640004 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="623c2c74-3b8f-477a-9d89-a686c3f7d236" containerName="s3-init" Apr 23 18:05:28.640138 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.640065 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="623c2c74-3b8f-477a-9d89-a686c3f7d236" containerName="s3-init" Apr 23 18:05:28.645186 
ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.645152 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:28.649216 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.649175 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-73769-kube-rbac-proxy-sar-config\"" Apr 23 18:05:28.649216 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.649208 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9mp96\"" Apr 23 18:05:28.649438 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.649237 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-73769-serving-cert\"" Apr 23 18:05:28.649565 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.649548 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:05:28.649923 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.649903 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh"] Apr 23 18:05:28.724753 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.724718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2870f85b-d914-487a-bfba-88359ee13c8d-openshift-service-ca-bundle\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:28.724753 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.724752 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:28.825780 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.825732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2870f85b-d914-487a-bfba-88359ee13c8d-openshift-service-ca-bundle\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:28.825780 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.825789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:28.826056 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:05:28.825918 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-73769-serving-cert: secret "model-chainer-raw-73769-serving-cert" not found Apr 23 18:05:28.826056 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:05:28.825992 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls podName:2870f85b-d914-487a-bfba-88359ee13c8d nodeName:}" failed. No retries permitted until 2026-04-23 18:05:29.325969249 +0000 UTC m=+774.141334876 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls") pod "model-chainer-raw-73769-75489dcf9b-h4fqh" (UID: "2870f85b-d914-487a-bfba-88359ee13c8d") : secret "model-chainer-raw-73769-serving-cert" not found Apr 23 18:05:28.826401 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:28.826381 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2870f85b-d914-487a-bfba-88359ee13c8d-openshift-service-ca-bundle\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:29.329299 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:29.329262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:29.331762 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:29.331739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls\") pod \"model-chainer-raw-73769-75489dcf9b-h4fqh\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:29.556957 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:29.556900 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:29.676008 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:29.675985 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh"] Apr 23 18:05:29.678548 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:05:29.678521 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2870f85b_d914_487a_bfba_88359ee13c8d.slice/crio-f70d1fb97ffe4f7c83eced710f2aae22b742ff99706e2b74f0e4fb2bcb65df5a WatchSource:0}: Error finding container f70d1fb97ffe4f7c83eced710f2aae22b742ff99706e2b74f0e4fb2bcb65df5a: Status 404 returned error can't find the container with id f70d1fb97ffe4f7c83eced710f2aae22b742ff99706e2b74f0e4fb2bcb65df5a Apr 23 18:05:29.680596 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:29.680578 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:05:29.969004 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:29.968915 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" event={"ID":"2870f85b-d914-487a-bfba-88359ee13c8d","Type":"ContainerStarted","Data":"f70d1fb97ffe4f7c83eced710f2aae22b742ff99706e2b74f0e4fb2bcb65df5a"} Apr 23 18:05:32.980650 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:32.980618 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" event={"ID":"2870f85b-d914-487a-bfba-88359ee13c8d","Type":"ContainerStarted","Data":"0faf84460d229c27171c5009fedfd26b54c6c67d89215d5cbe3bca9bfc11857b"} Apr 23 18:05:32.981033 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:32.980666 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:05:33.010017 
ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:33.009957 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podStartSLOduration=2.232291676 podStartE2EDuration="5.009938128s" podCreationTimestamp="2026-04-23 18:05:28 +0000 UTC" firstStartedPulling="2026-04-23 18:05:29.680756243 +0000 UTC m=+774.496121875" lastFinishedPulling="2026-04-23 18:05:32.458402688 +0000 UTC m=+777.273768327" observedRunningTime="2026-04-23 18:05:33.008619083 +0000 UTC m=+777.823984731" watchObservedRunningTime="2026-04-23 18:05:33.009938128 +0000 UTC m=+777.825303777" Apr 23 18:05:38.691945 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:38.691912 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh"] Apr 23 18:05:38.692330 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:38.692153 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" containerID="cri-o://0faf84460d229c27171c5009fedfd26b54c6c67d89215d5cbe3bca9bfc11857b" gracePeriod=30 Apr 23 18:05:38.699411 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:38.699382 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:43.696310 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:43.696264 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:48.696218 
ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:48.696134 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:53.696109 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:53.696072 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:05:58.696552 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:05:58.696513 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:06:03.696233 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:03.696190 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:06:08.696798 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:08.696757 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:06:09.091018 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.090986 2579 generic.go:358] "Generic (PLEG): container finished" podID="2870f85b-d914-487a-bfba-88359ee13c8d" 
containerID="0faf84460d229c27171c5009fedfd26b54c6c67d89215d5cbe3bca9bfc11857b" exitCode=0 Apr 23 18:06:09.091193 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.091043 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" event={"ID":"2870f85b-d914-487a-bfba-88359ee13c8d","Type":"ContainerDied","Data":"0faf84460d229c27171c5009fedfd26b54c6c67d89215d5cbe3bca9bfc11857b"} Apr 23 18:06:09.338367 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.338344 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:06:09.472889 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.472800 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2870f85b-d914-487a-bfba-88359ee13c8d-openshift-service-ca-bundle\") pod \"2870f85b-d914-487a-bfba-88359ee13c8d\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " Apr 23 18:06:09.473046 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.472918 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls\") pod \"2870f85b-d914-487a-bfba-88359ee13c8d\" (UID: \"2870f85b-d914-487a-bfba-88359ee13c8d\") " Apr 23 18:06:09.473198 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.473174 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2870f85b-d914-487a-bfba-88359ee13c8d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2870f85b-d914-487a-bfba-88359ee13c8d" (UID: "2870f85b-d914-487a-bfba-88359ee13c8d"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:06:09.475248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.475224 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2870f85b-d914-487a-bfba-88359ee13c8d" (UID: "2870f85b-d914-487a-bfba-88359ee13c8d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:06:09.574662 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.574625 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2870f85b-d914-487a-bfba-88359ee13c8d-proxy-tls\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:06:09.574662 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:09.574655 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2870f85b-d914-487a-bfba-88359ee13c8d-openshift-service-ca-bundle\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:06:10.095475 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:10.095390 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" event={"ID":"2870f85b-d914-487a-bfba-88359ee13c8d","Type":"ContainerDied","Data":"f70d1fb97ffe4f7c83eced710f2aae22b742ff99706e2b74f0e4fb2bcb65df5a"} Apr 23 18:06:10.095475 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:10.095432 2579 scope.go:117] "RemoveContainer" containerID="0faf84460d229c27171c5009fedfd26b54c6c67d89215d5cbe3bca9bfc11857b" Apr 23 18:06:10.095475 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:10.095434 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh" Apr 23 18:06:10.118877 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:10.118848 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh"] Apr 23 18:06:10.123135 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:10.123107 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-73769-75489dcf9b-h4fqh"] Apr 23 18:06:11.802342 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:06:11.802302 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" path="/var/lib/kubelet/pods/2870f85b-d914-487a-bfba-88359ee13c8d/volumes" Apr 23 18:07:08.942833 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.942793 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f"] Apr 23 18:07:08.943272 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.943153 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" Apr 23 18:07:08.943272 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.943165 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" Apr 23 18:07:08.943272 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.943224 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2870f85b-d914-487a-bfba-88359ee13c8d" containerName="model-chainer-raw-73769" Apr 23 18:07:08.946253 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.946231 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:08.949087 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.949063 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-050b4-serving-cert\"" Apr 23 18:07:08.949214 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.949094 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 18:07:08.949267 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.949246 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-9mp96\"" Apr 23 18:07:08.949320 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.949305 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-050b4-kube-rbac-proxy-sar-config\"" Apr 23 18:07:08.954698 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.954673 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f"] Apr 23 18:07:08.988465 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.988434 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:08.988618 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:08.988484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls\") pod 
\"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:09.089404 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:09.089374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:09.089555 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:09.089413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls\") pod \"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:09.089555 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:07:09.089546 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-serving-cert: secret "model-chainer-raw-hpa-050b4-serving-cert" not found Apr 23 18:07:09.089629 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:07:09.089597 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls podName:6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94 nodeName:}" failed. No retries permitted until 2026-04-23 18:07:09.589580663 +0000 UTC m=+874.404946289 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls") pod "model-chainer-raw-hpa-050b4-56df766c48-ckj2f" (UID: "6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94") : secret "model-chainer-raw-hpa-050b4-serving-cert" not found Apr 23 18:07:09.090072 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:09.090044 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:09.594107 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:09.594043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls\") pod \"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:09.596599 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:09.596574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls\") pod \"model-chainer-raw-hpa-050b4-56df766c48-ckj2f\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:09.877416 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:09.877322 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:10.000485 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:10.000453 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f"] Apr 23 18:07:10.003719 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:07:10.003684 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dcc1f0c_6923_4d2d_a878_33a4ba3f6d94.slice/crio-e7043ce87cbd14806a5a99c38313ed6553174acda81eb1649ae8f7a941fc0da8 WatchSource:0}: Error finding container e7043ce87cbd14806a5a99c38313ed6553174acda81eb1649ae8f7a941fc0da8: Status 404 returned error can't find the container with id e7043ce87cbd14806a5a99c38313ed6553174acda81eb1649ae8f7a941fc0da8 Apr 23 18:07:10.277632 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:10.277600 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" event={"ID":"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94","Type":"ContainerStarted","Data":"5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a"} Apr 23 18:07:10.277632 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:10.277635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" event={"ID":"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94","Type":"ContainerStarted","Data":"e7043ce87cbd14806a5a99c38313ed6553174acda81eb1649ae8f7a941fc0da8"} Apr 23 18:07:10.277855 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:10.277743 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:10.295880 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:10.295819 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podStartSLOduration=2.295804595 podStartE2EDuration="2.295804595s" podCreationTimestamp="2026-04-23 18:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:07:10.294260912 +0000 UTC m=+875.109626571" watchObservedRunningTime="2026-04-23 18:07:10.295804595 +0000 UTC m=+875.111170256" Apr 23 18:07:16.286629 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:16.286553 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:19.011869 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:19.011818 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f"] Apr 23 18:07:19.012257 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:19.012105 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" containerID="cri-o://5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a" gracePeriod=30 Apr 23 18:07:21.285157 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:21.285120 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:26.285190 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:26.285150 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 23 18:07:31.284470 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:31.284429 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:31.284953 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:31.284544 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:36.284341 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:36.284304 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:41.285642 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:41.285591 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:46.284889 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:46.284848 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 18:07:49.041322 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:07:49.041286 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dcc1f0c_6923_4d2d_a878_33a4ba3f6d94.slice/crio-conmon-5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a.scope\": RecentStats: unable to find data in memory cache]" Apr 23 18:07:49.041750 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:07:49.041294 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dcc1f0c_6923_4d2d_a878_33a4ba3f6d94.slice/crio-conmon-5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a.scope\": RecentStats: unable to find data in memory cache]" Apr 23 18:07:49.151631 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.151607 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:49.237321 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.237289 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-openshift-service-ca-bundle\") pod \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " Apr 23 18:07:49.237475 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.237353 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls\") pod \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\" (UID: \"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94\") " Apr 23 18:07:49.237681 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.237658 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod 
"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" (UID: "6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 18:07:49.239484 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.239458 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" (UID: "6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 18:07:49.338150 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.338077 2579 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-openshift-service-ca-bundle\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:07:49.338150 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.338106 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94-proxy-tls\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:07:49.390881 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.390849 2579 generic.go:358] "Generic (PLEG): container finished" podID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerID="5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a" exitCode=0 Apr 23 18:07:49.391018 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.390930 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" Apr 23 18:07:49.391018 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.390934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" event={"ID":"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94","Type":"ContainerDied","Data":"5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a"} Apr 23 18:07:49.391018 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.390975 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f" event={"ID":"6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94","Type":"ContainerDied","Data":"e7043ce87cbd14806a5a99c38313ed6553174acda81eb1649ae8f7a941fc0da8"} Apr 23 18:07:49.391018 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.390991 2579 scope.go:117] "RemoveContainer" containerID="5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a" Apr 23 18:07:49.399330 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.399315 2579 scope.go:117] "RemoveContainer" containerID="5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a" Apr 23 18:07:49.399570 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:07:49.399549 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a\": container with ID starting with 5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a not found: ID does not exist" containerID="5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a" Apr 23 18:07:49.399632 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.399579 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a"} err="failed to get container status 
\"5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a\": rpc error: code = NotFound desc = could not find container \"5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a\": container with ID starting with 5b35f7c3d38d53239416fed33b56a5839397f2405270426d88b7a42a25ab109a not found: ID does not exist" Apr 23 18:07:49.411508 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.411481 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f"] Apr 23 18:07:49.415172 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.415153 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-050b4-56df766c48-ckj2f"] Apr 23 18:07:49.803725 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:07:49.803685 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" path="/var/lib/kubelet/pods/6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94/volumes" Apr 23 18:15:52.393476 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.393442 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qb7qd/must-gather-dg8w5"] Apr 23 18:15:52.393960 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.393782 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" Apr 23 18:15:52.393960 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.393793 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" Apr 23 18:15:52.393960 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.393870 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6dcc1f0c-6923-4d2d-a878-33a4ba3f6d94" containerName="model-chainer-raw-hpa-050b4" Apr 23 18:15:52.396798 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.396782 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.399729 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.399707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qb7qd\"/\"default-dockercfg-5bxrj\"" Apr 23 18:15:52.399729 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.399715 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qb7qd\"/\"openshift-service-ca.crt\"" Apr 23 18:15:52.401262 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.401246 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qb7qd\"/\"kube-root-ca.crt\"" Apr 23 18:15:52.404997 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.404957 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qb7qd/must-gather-dg8w5"] Apr 23 18:15:52.449381 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.449355 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-must-gather-output\") pod \"must-gather-dg8w5\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.449520 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.449409 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xpf\" (UniqueName: \"kubernetes.io/projected/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-kube-api-access-t5xpf\") pod \"must-gather-dg8w5\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.550414 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.550382 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-t5xpf\" (UniqueName: \"kubernetes.io/projected/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-kube-api-access-t5xpf\") pod \"must-gather-dg8w5\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.550562 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.550449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-must-gather-output\") pod \"must-gather-dg8w5\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.550751 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.550735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-must-gather-output\") pod \"must-gather-dg8w5\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.560076 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.560045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xpf\" (UniqueName: \"kubernetes.io/projected/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-kube-api-access-t5xpf\") pod \"must-gather-dg8w5\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.715223 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.715147 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:15:52.844351 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.844299 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qb7qd/must-gather-dg8w5"] Apr 23 18:15:52.847051 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:15:52.847023 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ed624c_30a2_44fe_bfba_f3b6d5ef224e.slice/crio-618668b2adb3243ae99594f7eb1be6509761a052f2eaa553f7da20999b47b8d8 WatchSource:0}: Error finding container 618668b2adb3243ae99594f7eb1be6509761a052f2eaa553f7da20999b47b8d8: Status 404 returned error can't find the container with id 618668b2adb3243ae99594f7eb1be6509761a052f2eaa553f7da20999b47b8d8 Apr 23 18:15:52.848693 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:52.848677 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 18:15:53.843535 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:53.843499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" event={"ID":"07ed624c-30a2-44fe-bfba-f3b6d5ef224e","Type":"ContainerStarted","Data":"618668b2adb3243ae99594f7eb1be6509761a052f2eaa553f7da20999b47b8d8"} Apr 23 18:15:57.859632 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:57.859596 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" event={"ID":"07ed624c-30a2-44fe-bfba-f3b6d5ef224e","Type":"ContainerStarted","Data":"3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef"} Apr 23 18:15:57.860102 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:57.859639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" 
event={"ID":"07ed624c-30a2-44fe-bfba-f3b6d5ef224e","Type":"ContainerStarted","Data":"4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769"} Apr 23 18:15:57.876538 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:15:57.876490 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" podStartSLOduration=1.5086178879999999 podStartE2EDuration="5.876474643s" podCreationTimestamp="2026-04-23 18:15:52 +0000 UTC" firstStartedPulling="2026-04-23 18:15:52.848795996 +0000 UTC m=+1397.664161625" lastFinishedPulling="2026-04-23 18:15:57.216652751 +0000 UTC m=+1402.032018380" observedRunningTime="2026-04-23 18:15:57.875052368 +0000 UTC m=+1402.690418052" watchObservedRunningTime="2026-04-23 18:15:57.876474643 +0000 UTC m=+1402.691840291" Apr 23 18:16:15.917379 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:15.917348 2579 generic.go:358] "Generic (PLEG): container finished" podID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerID="4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769" exitCode=0 Apr 23 18:16:15.917807 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:15.917427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" event={"ID":"07ed624c-30a2-44fe-bfba-f3b6d5ef224e","Type":"ContainerDied","Data":"4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769"} Apr 23 18:16:15.917807 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:15.917764 2579 scope.go:117] "RemoveContainer" containerID="4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769" Apr 23 18:16:16.219530 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:16.219452 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb7qd_must-gather-dg8w5_07ed624c-30a2-44fe-bfba-f3b6d5ef224e/gather/0.log" Apr 23 18:16:19.753600 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:19.753564 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-nfps4_ec5a0b72-a897-43b8-a606-7100e1d87870/global-pull-secret-syncer/0.log" Apr 23 18:16:19.866498 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:19.866467 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-84tsb_a1cf0add-6b75-4312-8f16-a023261bbef3/konnectivity-agent/0.log" Apr 23 18:16:19.950811 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:19.950782 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-202.ec2.internal_5920d41245667f30b303f7ab0e3ca5d3/haproxy/0.log" Apr 23 18:16:21.617624 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.617589 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qb7qd/must-gather-dg8w5"] Apr 23 18:16:21.618063 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.617812 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="copy" containerID="cri-o://3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef" gracePeriod=2 Apr 23 18:16:21.624278 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.624252 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qb7qd/must-gather-dg8w5"] Apr 23 18:16:21.845489 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.845466 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb7qd_must-gather-dg8w5_07ed624c-30a2-44fe-bfba-f3b6d5ef224e/copy/0.log" Apr 23 18:16:21.845839 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.845805 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:16:21.916720 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.916659 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xpf\" (UniqueName: \"kubernetes.io/projected/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-kube-api-access-t5xpf\") pod \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " Apr 23 18:16:21.916720 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.916714 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-must-gather-output\") pod \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\" (UID: \"07ed624c-30a2-44fe-bfba-f3b6d5ef224e\") " Apr 23 18:16:21.918055 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.918033 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "07ed624c-30a2-44fe-bfba-f3b6d5ef224e" (UID: "07ed624c-30a2-44fe-bfba-f3b6d5ef224e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 18:16:21.918895 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.918873 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-kube-api-access-t5xpf" (OuterVolumeSpecName: "kube-api-access-t5xpf") pod "07ed624c-30a2-44fe-bfba-f3b6d5ef224e" (UID: "07ed624c-30a2-44fe-bfba-f3b6d5ef224e"). InnerVolumeSpecName "kube-api-access-t5xpf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 18:16:21.939399 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.939376 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb7qd_must-gather-dg8w5_07ed624c-30a2-44fe-bfba-f3b6d5ef224e/copy/0.log" Apr 23 18:16:21.939658 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.939640 2579 generic.go:358] "Generic (PLEG): container finished" podID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerID="3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef" exitCode=143 Apr 23 18:16:21.939731 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.939692 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb7qd/must-gather-dg8w5" Apr 23 18:16:21.939777 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.939694 2579 scope.go:117] "RemoveContainer" containerID="3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef" Apr 23 18:16:21.947184 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.947166 2579 scope.go:117] "RemoveContainer" containerID="4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769" Apr 23 18:16:21.957959 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.957944 2579 scope.go:117] "RemoveContainer" containerID="3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef" Apr 23 18:16:21.958216 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:16:21.958196 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef\": container with ID starting with 3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef not found: ID does not exist" containerID="3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef" Apr 23 18:16:21.958299 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.958227 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef"} err="failed to get container status \"3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef\": rpc error: code = NotFound desc = could not find container \"3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef\": container with ID starting with 3c56112c5cbb0427bfab0c62e8d6c65e43847964dec0b57aedfff942f6040bef not found: ID does not exist" Apr 23 18:16:21.958299 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.958251 2579 scope.go:117] "RemoveContainer" containerID="4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769" Apr 23 18:16:21.958497 ip-10-0-130-202 kubenswrapper[2579]: E0423 18:16:21.958478 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769\": container with ID starting with 4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769 not found: ID does not exist" containerID="4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769" Apr 23 18:16:21.958536 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:21.958504 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769"} err="failed to get container status \"4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769\": rpc error: code = NotFound desc = could not find container \"4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769\": container with ID starting with 4db2c2d807e52605bb3d48fe7092c8d02441c5db954cce95eccf1a16536c9769 not found: ID does not exist" Apr 23 18:16:22.017962 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:22.017938 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5xpf\" (UniqueName: 
\"kubernetes.io/projected/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-kube-api-access-t5xpf\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:16:22.017962 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:22.017960 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/07ed624c-30a2-44fe-bfba-f3b6d5ef224e-must-gather-output\") on node \"ip-10-0-130-202.ec2.internal\" DevicePath \"\"" Apr 23 18:16:23.189042 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.189017 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/alertmanager/0.log" Apr 23 18:16:23.214510 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.214487 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/config-reloader/0.log" Apr 23 18:16:23.243039 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.243014 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/kube-rbac-proxy-web/0.log" Apr 23 18:16:23.269535 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.269509 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/kube-rbac-proxy/0.log" Apr 23 18:16:23.295241 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.295217 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/kube-rbac-proxy-metric/0.log" Apr 23 18:16:23.319328 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.319303 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/prom-label-proxy/0.log" Apr 23 18:16:23.344741 ip-10-0-130-202 
kubenswrapper[2579]: I0423 18:16:23.344712 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03db3ae9-3bb0-4846-be9b-3e502d6af2ea/init-config-reloader/0.log" Apr 23 18:16:23.408888 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.408854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-fz4sr_1f3971d2-59c6-44bc-b53a-c1f78a8b154e/kube-state-metrics/0.log" Apr 23 18:16:23.431996 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.431926 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-fz4sr_1f3971d2-59c6-44bc-b53a-c1f78a8b154e/kube-rbac-proxy-main/0.log" Apr 23 18:16:23.457102 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.457080 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-fz4sr_1f3971d2-59c6-44bc-b53a-c1f78a8b154e/kube-rbac-proxy-self/0.log" Apr 23 18:16:23.563407 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.563385 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4926f_a8df34d0-5aac-4603-b040-3396c2646e7a/node-exporter/0.log" Apr 23 18:16:23.635243 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.635217 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4926f_a8df34d0-5aac-4603-b040-3396c2646e7a/kube-rbac-proxy/0.log" Apr 23 18:16:23.663154 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.663135 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4926f_a8df34d0-5aac-4603-b040-3396c2646e7a/init-textfile/0.log" Apr 23 18:16:23.804123 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.804092 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" 
path="/var/lib/kubelet/pods/07ed624c-30a2-44fe-bfba-f3b6d5ef224e/volumes" Apr 23 18:16:23.854137 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.854107 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hhctl_e11ff4c9-1509-4a31-a2de-6dd234e3cd0c/kube-rbac-proxy-main/0.log" Apr 23 18:16:23.879216 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.879188 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hhctl_e11ff4c9-1509-4a31-a2de-6dd234e3cd0c/kube-rbac-proxy-self/0.log" Apr 23 18:16:23.901310 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.901282 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hhctl_e11ff4c9-1509-4a31-a2de-6dd234e3cd0c/openshift-state-metrics/0.log" Apr 23 18:16:23.940509 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.940481 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/prometheus/0.log" Apr 23 18:16:23.961904 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.961884 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/config-reloader/0.log" Apr 23 18:16:23.986381 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:23.986358 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/thanos-sidecar/0.log" Apr 23 18:16:24.011062 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.011033 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/kube-rbac-proxy-web/0.log" Apr 23 18:16:24.036903 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.036884 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/kube-rbac-proxy/0.log" Apr 23 18:16:24.062249 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.062197 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/kube-rbac-proxy-thanos/0.log" Apr 23 18:16:24.089884 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.089861 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2079a3b2-c08a-44a6-a451-96a54393ff3b/init-config-reloader/0.log" Apr 23 18:16:24.120896 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.120865 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jtmbp_0448d038-a900-4370-b832-441691c982c3/prometheus-operator/0.log" Apr 23 18:16:24.140131 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.140112 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jtmbp_0448d038-a900-4370-b832-441691c982c3/kube-rbac-proxy/0.log" Apr 23 18:16:24.168097 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.168068 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-t6rbc_49a66fbd-c07c-463a-afd9-fd6e00f9fbbb/prometheus-operator-admission-webhook/0.log" Apr 23 18:16:24.217513 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.217463 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dd6cfc8f5-9ll9g_1048dfda-10d2-4413-b065-476808e431a8/telemeter-client/0.log" Apr 23 18:16:24.249050 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.249027 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dd6cfc8f5-9ll9g_1048dfda-10d2-4413-b065-476808e431a8/reload/0.log" Apr 23 18:16:24.274364 
ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.274329 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5dd6cfc8f5-9ll9g_1048dfda-10d2-4413-b065-476808e431a8/kube-rbac-proxy/0.log" Apr 23 18:16:24.314875 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.314792 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65b888866-skbbk_8860c6ed-5eaa-426d-bf4c-421a54cda563/thanos-query/0.log" Apr 23 18:16:24.346625 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.346600 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65b888866-skbbk_8860c6ed-5eaa-426d-bf4c-421a54cda563/kube-rbac-proxy-web/0.log" Apr 23 18:16:24.373428 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.373408 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65b888866-skbbk_8860c6ed-5eaa-426d-bf4c-421a54cda563/kube-rbac-proxy/0.log" Apr 23 18:16:24.404105 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.404079 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65b888866-skbbk_8860c6ed-5eaa-426d-bf4c-421a54cda563/prom-label-proxy/0.log" Apr 23 18:16:24.431026 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.431001 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65b888866-skbbk_8860c6ed-5eaa-426d-bf4c-421a54cda563/kube-rbac-proxy-rules/0.log" Apr 23 18:16:24.458344 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:24.458317 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65b888866-skbbk_8860c6ed-5eaa-426d-bf4c-421a54cda563/kube-rbac-proxy-metrics/0.log" Apr 23 18:16:26.597229 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597195 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t"] Apr 23 18:16:26.597597 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597552 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="copy" Apr 23 18:16:26.597597 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597563 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="copy" Apr 23 18:16:26.597597 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597577 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="gather" Apr 23 18:16:26.597597 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597582 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="gather" Apr 23 18:16:26.597725 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597629 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="copy" Apr 23 18:16:26.597725 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.597637 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="07ed624c-30a2-44fe-bfba-f3b6d5ef224e" containerName="gather" Apr 23 18:16:26.602562 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.602542 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.605523 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.605500 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-46lhs\"/\"openshift-service-ca.crt\"" Apr 23 18:16:26.605523 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.605501 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-46lhs\"/\"default-dockercfg-g7l5f\"" Apr 23 18:16:26.606772 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.606755 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-46lhs\"/\"kube-root-ca.crt\"" Apr 23 18:16:26.610402 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.610379 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t"] Apr 23 18:16:26.755131 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.755095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-podres\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.755314 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.755156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-sys\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.755314 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.755183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-lib-modules\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.755314 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.755219 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-proc\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.755314 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.755234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rltwm\" (UniqueName: \"kubernetes.io/projected/8d3aedfc-b848-4226-830a-284509223248-kube-api-access-rltwm\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856025 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.855922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-proc\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856025 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.855978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rltwm\" (UniqueName: \"kubernetes.io/projected/8d3aedfc-b848-4226-830a-284509223248-kube-api-access-rltwm\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " 
pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856025 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856034 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-podres\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856061 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-proc\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-sys\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-lib-modules\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856135 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-sys\") pod 
\"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-podres\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.856248 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.856205 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d3aedfc-b848-4226-830a-284509223248-lib-modules\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.864846 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.864790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rltwm\" (UniqueName: \"kubernetes.io/projected/8d3aedfc-b848-4226-830a-284509223248-kube-api-access-rltwm\") pod \"perf-node-gather-daemonset-bhh7t\" (UID: \"8d3aedfc-b848-4226-830a-284509223248\") " pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:26.912986 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:26.912952 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:27.036196 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.036169 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t"] Apr 23 18:16:27.038585 ip-10-0-130-202 kubenswrapper[2579]: W0423 18:16:27.038543 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d3aedfc_b848_4226_830a_284509223248.slice/crio-eed10c605383e53b98c83ea3f040030252f9f2efca2c685479d6607175cdb8a6 WatchSource:0}: Error finding container eed10c605383e53b98c83ea3f040030252f9f2efca2c685479d6607175cdb8a6: Status 404 returned error can't find the container with id eed10c605383e53b98c83ea3f040030252f9f2efca2c685479d6607175cdb8a6 Apr 23 18:16:27.548904 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.548864 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nq87q_af20cd75-f1bb-4dcd-b179-14bcb34e5ef1/dns/0.log" Apr 23 18:16:27.571224 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.571193 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nq87q_af20cd75-f1bb-4dcd-b179-14bcb34e5ef1/kube-rbac-proxy/0.log" Apr 23 18:16:27.595309 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.595269 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c9dz6_47e47db9-35db-473d-b6e8-181ec486420c/dns-node-resolver/0.log" Apr 23 18:16:27.961371 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.961278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" event={"ID":"8d3aedfc-b848-4226-830a-284509223248","Type":"ContainerStarted","Data":"645d935ed166c5008614c650c3892b3265f7febc4bef42fc731dfb64cda13939"} Apr 23 18:16:27.961371 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.961314 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" event={"ID":"8d3aedfc-b848-4226-830a-284509223248","Type":"ContainerStarted","Data":"eed10c605383e53b98c83ea3f040030252f9f2efca2c685479d6607175cdb8a6"} Apr 23 18:16:27.961897 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.961403 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:27.978728 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:27.978679 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" podStartSLOduration=1.978663088 podStartE2EDuration="1.978663088s" podCreationTimestamp="2026-04-23 18:16:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:16:27.977608114 +0000 UTC m=+1432.792973773" watchObservedRunningTime="2026-04-23 18:16:27.978663088 +0000 UTC m=+1432.794028735" Apr 23 18:16:28.114531 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:28.114500 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jbbn4_b6c26cb5-e282-4840-a6ae-c60523f49733/node-ca/0.log" Apr 23 18:16:29.288322 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:29.288285 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qnq92_676e7d7a-e558-49c3-bc63-788e4d3f9a19/serve-healthcheck-canary/0.log" Apr 23 18:16:29.729914 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:29.729808 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q9nhf_96554462-85fb-43e0-998f-5e9898c338b8/kube-rbac-proxy/0.log" Apr 23 18:16:29.751810 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:29.751784 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-q9nhf_96554462-85fb-43e0-998f-5e9898c338b8/exporter/0.log" Apr 23 18:16:29.775095 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:29.775075 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q9nhf_96554462-85fb-43e0-998f-5e9898c338b8/extractor/0.log" Apr 23 18:16:31.753520 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:31.753489 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6fc5d867c5-92hgn_75e7b965-7ae8-4e5c-8ef3-e321978caeba/manager/0.log" Apr 23 18:16:31.797819 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:31.797786 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-m2pk5_0e1cd28c-87d2-4ce6-9c87-bf429af34772/server/0.log" Apr 23 18:16:31.909661 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:31.909631 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-d8s5w_623c2c74-3b8f-477a-9d89-a686c3f7d236/s3-init/0.log" Apr 23 18:16:33.978744 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:33.978717 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-46lhs/perf-node-gather-daemonset-bhh7t" Apr 23 18:16:37.188448 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.188409 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2zdbf_ed24d47a-1604-4017-b22f-4d68978cdb27/kube-multus/0.log" Apr 23 18:16:37.455057 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.454990 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/kube-multus-additional-cni-plugins/0.log" Apr 23 18:16:37.477496 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.477470 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/egress-router-binary-copy/0.log" Apr 23 18:16:37.500221 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.500198 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/cni-plugins/0.log" Apr 23 18:16:37.522480 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.522460 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/bond-cni-plugin/0.log" Apr 23 18:16:37.544005 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.543973 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/routeoverride-cni/0.log" Apr 23 18:16:37.566469 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.566447 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/whereabouts-cni-bincopy/0.log" Apr 23 18:16:37.589991 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.589969 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4bt4_fc6e02b3-9721-4d2d-aa26-4ed9f4f1607b/whereabouts-cni/0.log" Apr 23 18:16:37.840764 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.840732 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gnxs9_67c4e3ae-cc88-433d-8549-c77153e2e1d6/network-metrics-daemon/0.log" Apr 23 18:16:37.862054 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:37.862015 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gnxs9_67c4e3ae-cc88-433d-8549-c77153e2e1d6/kube-rbac-proxy/0.log" Apr 23 18:16:38.883572 ip-10-0-130-202 
kubenswrapper[2579]: I0423 18:16:38.883543 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/ovn-controller/0.log" Apr 23 18:16:38.922429 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:38.922402 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/ovn-acl-logging/0.log" Apr 23 18:16:38.957906 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:38.957884 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/kube-rbac-proxy-node/0.log" Apr 23 18:16:39.006643 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:39.006619 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 18:16:39.048429 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:39.048398 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/northd/0.log" Apr 23 18:16:39.081315 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:39.081288 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/nbdb/0.log" Apr 23 18:16:39.115535 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:39.115514 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/sbdb/0.log" Apr 23 18:16:39.253410 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:39.253377 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9sxcg_0f7a35cb-32ab-4b1d-a1de-f711ce9b0045/ovnkube-controller/0.log" Apr 23 18:16:40.679272 ip-10-0-130-202 
kubenswrapper[2579]: I0423 18:16:40.679235 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qxm9s_76db35ef-2b60-42ed-bb55-38ec1534f90f/network-check-target-container/0.log" Apr 23 18:16:41.761334 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:41.761299 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nm928_f8d9bbd2-cf59-434c-a2f6-2abd36d2113b/iptables-alerter/0.log" Apr 23 18:16:42.605657 ip-10-0-130-202 kubenswrapper[2579]: I0423 18:16:42.605623 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-kqfft_12d2f5b6-b903-4c15-857e-f255d7d678fd/tuned/0.log"