Apr 24 23:53:46.898691 ip-10-0-142-30 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:47.312352 ip-10-0-142-30 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:47.312352 ip-10-0-142-30 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:47.312352 ip-10-0-142-30 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:47.312352 ip-10-0-142-30 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:47.312352 ip-10-0-142-30 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
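The deprecation warnings above all point at the file named by the kubelet's --config flag. A minimal, illustrative KubeletConfiguration fragment showing where those flags would move (the values are examples, not taken from this node; --minimum-container-ttl-duration is replaced by the eviction settings, and --pod-infra-container-image has no config-file equivalent since the sandbox image now comes from the CRI):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration (example thresholds)
evictionHard:
  memory.available: 100Mi
```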
Apr 24 23:53:47.313893 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.313807 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:47.316744 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316730 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316745 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316749 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316753 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316755 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316758 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316761 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316764 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316767 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316770 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316773 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316775 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316778 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316781 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316783 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316786 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316789 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:47.316783 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316792 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316800 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316803 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316806 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316809 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316812 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316817 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316820 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316822 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316825 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316828 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316830 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316833 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316835 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316838 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316841 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316844 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316846 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316849 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:47.317189 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316851 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316854 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316856 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316859 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316861 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316864 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316866 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316869 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316873 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316876 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316879 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316882 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316885 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316888 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316892 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316895 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316899 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316902 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316904 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:47.317673 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316907 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316910 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316912 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316915 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316917 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316920 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316923 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316925 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316928 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316931 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316934 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316937 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316939 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316942 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316945 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316947 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316950 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316952 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316955 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316957 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:47.318139 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316960 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316962 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316965 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316968 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316970 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316973 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316975 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316979 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316982 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316984 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.316986 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317353 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317358 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317361 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317363 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317383 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317387 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317391 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317396 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317399 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.318642 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317402 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317405 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317408 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317411 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317414 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317416 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317419 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317422 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317425 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317428 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317430 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317433 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317436 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317439 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317442 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317444 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317447 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317449 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317452 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317456 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:47.319127 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317458 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317461 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317463 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317466 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317469 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317471 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317474 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317476 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317479 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317482 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317484 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317487 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317490 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317492 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317495 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317498 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317502 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317506 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317509 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:47.319671 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317513 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317517 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317520 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317522 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317526 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317529 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317531 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317534 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317537 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317540 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317542 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317545 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317548 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317551 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317569 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317572 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317576 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317579 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317582 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317585 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.320131 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317587 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317590 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317592 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317595 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317597 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317600 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317603 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317606 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317609 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317612 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317614 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317617 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317620 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317625 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
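The flood of `unrecognized feature gate` warnings above is the kubelet checking every gate named in its configuration against its own registry of known gates: OpenShift-specific gates are unknown to the upstream parser, so each one is logged and skipped rather than treated as fatal, while explicitly-set deprecated or GA gates get the extra `feature_gate.go:349`/`:351` notes. A minimal sketch of that validation loop, with a hypothetical two-entry registry (not the kubelet's actual implementation):

```python
# Sketch: warn-and-skip handling of unknown feature gates, plus extra warnings
# for deprecated/GA gates that are explicitly enabled. KNOWN_GATES is a
# hypothetical stand-in for the kubelet's real registry in feature_gate.go.
KNOWN_GATES = {
    "KMSv1": "DEPRECATED",
    "ServiceAccountTokenNodeBinding": "GA",
}

def apply_feature_gates(requested: dict[str, bool]) -> tuple[dict[str, bool], list[str]]:
    """Return accepted gate settings and the warnings emitted along the way."""
    accepted, warnings = {}, []
    for name, enabled in requested.items():
        maturity = KNOWN_GATES.get(name)
        if maturity is None:
            warnings.append(f"unrecognized feature gate: {name}")
            continue  # unknown gates are skipped, not fatal
        if maturity == "DEPRECATED" and enabled:
            warnings.append(f"Setting deprecated feature gate {name}=true. "
                            "It will be removed in a future release.")
        elif maturity == "GA" and enabled:
            warnings.append(f"Setting GA feature gate {name}=true. "
                            "It will be removed in a future release.")
        accepted[name] = enabled
    return accepted, warnings

accepted, warnings = apply_feature_gates(
    {"KMSv1": True, "ServiceAccountTokenNodeBinding": True, "OVNObservability": True}
)
# "OVNObservability" is not in the registry, so it is warned about and dropped.
```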
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317629 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317631 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317634 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.317637 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317706 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317713 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:47.320631 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317719 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317724 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317728 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317734 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317740 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317747 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317752 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317757 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317761 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317764 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317767 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317770 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317773 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317776 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317779 2569 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317781 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317784 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317792 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317795 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317799 2569 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317802 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317805 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317810 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317813 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:47.321117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317816 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317820 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317823 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317826 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317829 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317832 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317835 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317839 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317843 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317846 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317849 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317852 2569 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317855 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317860 2569 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317864 2569 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317867 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317870 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317873 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317877 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317880 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317883 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317886 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317889 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317892 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317895 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:53:47.321709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317900 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317903 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 23:53:47.322319 ip-10-0-142-30
kubenswrapper[2569]: I0424 23:53:47.317906 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317909 2569 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317913 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317918 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317921 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317924 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317928 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317931 2569 flags.go:64] FLAG: --help="false" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317934 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317937 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317941 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317944 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317947 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317951 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 
23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317953 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317956 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317959 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317974 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317978 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317981 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317985 2569 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317988 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:47.322319 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317991 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317994 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.317998 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318000 2569 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318003 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318006 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318009 2569 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318015 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318019 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318022 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318025 2569 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318031 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318034 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318037 2569 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318041 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318048 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318051 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318055 2569 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318058 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318061 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318064 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:47.322933 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:53:47.318067 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318070 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318073 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318076 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:47.322933 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318084 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318087 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318090 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318093 2569 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318096 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318101 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318104 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318107 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318110 2569 flags.go:64] FLAG: --port="10250" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318113 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:47.323547 
ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318117 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d0ab115df0ccff63" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318120 2569 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318123 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318126 2569 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318129 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318132 2569 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318138 2569 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318141 2569 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318145 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318149 2569 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318152 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318156 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318159 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318162 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:47.323547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318165 2569 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:47.323547 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:53:47.318168 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318171 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318175 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318178 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318181 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318183 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318187 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318190 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318193 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318196 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318199 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318201 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318204 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318207 2569 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318210 2569 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318215 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318218 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318221 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318225 2569 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318228 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318231 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318234 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318237 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318241 2569 flags.go:64] FLAG: --v="2" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318245 2569 flags.go:64] FLAG: --version="false" Apr 24 23:53:47.324407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318251 2569 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318256 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.318259 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318351 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:47.325517 ip-10-0-142-30 
kubenswrapper[2569]: W0424 23:53:47.318355 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318384 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318388 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318391 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318403 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318406 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318410 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318413 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318416 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318418 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318421 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318424 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318426 2569 feature_gate.go:328] 
unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318429 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318432 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:47.325517 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318435 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318437 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318440 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318442 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318445 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318448 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318450 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318453 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318456 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318458 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 
23:53:47.318460 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318464 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318467 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318475 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318477 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318480 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318483 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318485 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318488 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318491 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:47.326272 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318494 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318497 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318499 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:47.327094 
ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318502 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318504 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318507 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318510 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318512 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318515 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318517 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318520 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318522 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318525 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318527 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318530 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318533 2569 feature_gate.go:328] unrecognized 
feature gate: DNSNameResolver Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318535 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318538 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318541 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318543 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:47.327094 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318546 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318548 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318551 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318555 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318558 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318562 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318564 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318567 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 
23:53:47.318570 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318572 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318574 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318577 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318580 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318583 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318586 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318588 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318591 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318593 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318597 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:47.327634 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318600 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318603 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318606 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318609 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318611 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318614 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318617 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318619 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318622 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318624 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.318627 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.319134 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.327703 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.327724 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327804 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327812 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:47.328108 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327817 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327822 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327827 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327832 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327837 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327841 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327845 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall 
Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327850 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327854 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327858 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327863 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327867 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327871 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327875 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327879 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327884 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327889 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327893 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327896 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327899 2569 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 24 23:53:47.328528 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327901 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327904 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327907 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327909 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327912 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327914 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327918 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327920 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327923 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327926 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327929 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327933 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327938 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327942 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327946 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327949 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327952 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327955 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327958 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:47.329053 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327961 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327963 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327968 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327971 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327974 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327977 2569 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327979 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327982 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327985 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327987 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327990 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327993 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327995 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.327998 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328000 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328003 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328005 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328008 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 
23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328011 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:47.329533 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328013 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328016 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328018 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328022 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328025 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328028 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328030 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328033 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328035 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328038 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328040 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 
23:53:47.328043 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328045 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328048 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328050 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328053 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328056 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328059 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328061 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:47.329988 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328063 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328066 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328068 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328071 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328074 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:47.330497 ip-10-0-142-30 
kubenswrapper[2569]: W0424 23:53:47.328076 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328079 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.328084 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328177 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328182 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328186 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328190 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328194 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328197 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328199 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328202 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:47.330497 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328205 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328208 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328210 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328213 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328215 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328218 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328221 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328223 2569 feature_gate.go:328] unrecognized feature gate: 
KMSEncryptionProvider Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328225 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328228 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328230 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328233 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328236 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328238 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328241 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328244 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328246 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328249 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328251 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:47.330903 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328254 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:47.330903 ip-10-0-142-30 
kubenswrapper[2569]: W0424 23:53:47.328256 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328259 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328263 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328267 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328270 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328273 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328276 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328278 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328281 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328284 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328286 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328289 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 
23:53:47.328292 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328295 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328297 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328300 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328303 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328305 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328308 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:47.331494 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328311 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328313 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328316 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328318 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328320 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328323 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 
24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328325 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328328 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328331 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328333 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328336 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328339 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328341 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328344 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328347 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328349 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328352 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328354 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328357 2569 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328359 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:47.331974 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328362 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328381 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328386 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328389 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328392 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328395 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328397 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328400 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328402 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328405 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328408 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328410 2569 feature_gate.go:328] unrecognized feature 
gate: NoRegistryClusterOperations Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328413 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328415 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328418 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328420 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328422 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328425 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:47.332470 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:47.328427 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:47.332937 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.328432 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:47.332937 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.328997 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 23:53:47.332937 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.331020 2569 bootstrap.go:101] 
"Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 23:53:47.332937 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.332061 2569 server.go:1019] "Starting client certificate rotation" Apr 24 23:53:47.332937 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.332158 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:47.332937 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.332194 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:47.355669 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.355650 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:47.358124 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.358102 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:47.371670 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.371646 2569 log.go:25] "Validated CRI v1 runtime API" Apr 24 23:53:47.377135 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.377120 2569 log.go:25] "Validated CRI v1 image API" Apr 24 23:53:47.378289 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.378274 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 23:53:47.382928 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.382903 2569 fs.go:135] Filesystem UUIDs: map[2e9afd93-5c56-4d92-a838-a1607c4e1447:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ea5ceb0c-5c31-46ec-8648-6cdd8e638128:/dev/nvme0n1p3] Apr 24 23:53:47.383030 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.382925 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs 
blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 23:53:47.386698 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.386681 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:47.389930 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.389817 2569 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:47.388064772 +0000 UTC m=+0.378468776 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096431 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2807a270020320b2e93fc80aaadcde SystemUUID:ec2807a2-7002-0320-b2e9-3fc80aaadcde BootID:8b1aa948-f98b-471f-a076-50e7fdca84b6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:77:58:ce:69:4b Speed:0 
Mtu:9001} {Name:ens5 MacAddress:02:77:58:ce:69:4b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:3a:20:4c:a1:82 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 23:53:47.389930 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.389919 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 23:53:47.390094 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.390031 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 23:53:47.391269 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.391246 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:53:47.391450 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.391271 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-30.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:47.391541 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.391464 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:47.391541 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.391475 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:47.391541 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.391493 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:47.392300 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.392288 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:47.393565 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.393552 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:47.393687 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.393676 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:47.397051 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.397039 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:47.397120 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.397060 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:47.397120 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.397082 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 23:53:47.397120 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.397096 2569 kubelet.go:397] "Adding apiserver pod source" Apr 24 23:53:47.397120 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.397108 2569 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 23:53:47.398311 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.398298 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:47.398407 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.398321 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:47.400866 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.400848 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:47.402642 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.402627 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:47.403144 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.403129 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dz8zj" Apr 24 23:53:47.404240 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404216 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:47.404240 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404234 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:47.404240 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404241 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404246 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404251 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404257 2569 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404262 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404268 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404275 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404281 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404289 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:47.404401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.404297 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:47.405177 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.405165 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:47.405216 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.405179 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:47.407724 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.407705 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-30.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:47.408404 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.408383 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-30.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:47.408453 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.408382 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:47.408843 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.408830 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:47.408886 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.408866 2569 server.go:1295] "Started kubelet" Apr 24 23:53:47.408970 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.408930 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:47.409075 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.409029 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:47.409132 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.409097 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:47.409587 ip-10-0-142-30 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 23:53:47.410612 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.410594 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:47.410831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.410815 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dz8zj" Apr 24 23:53:47.411398 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.411359 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:47.413565 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.412760 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-30.ec2.internal.18a97023e2fe1c45 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-30.ec2.internal,UID:ip-10-0-142-30.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-30.ec2.internal,},FirstTimestamp:2026-04-24 23:53:47.408841797 +0000 UTC m=+0.399245804,LastTimestamp:2026-04-24 23:53:47.408841797 +0000 UTC m=+0.399245804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-30.ec2.internal,}" Apr 24 23:53:47.415398 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.415356 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:47.415877 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.415857 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:47.416613 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.416579 2569 desired_state_of_world_populator.go:150] 
"Desired state populator starts to run" Apr 24 23:53:47.416613 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.416580 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:47.416613 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.416613 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:47.416781 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.416610 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.416781 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.416684 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:47.416781 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.416691 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:47.417072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.416908 2569 factory.go:55] Registering systemd factory Apr 24 23:53:47.417147 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417105 2569 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:47.417318 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417305 2569 factory.go:153] Registering CRI-O factory Apr 24 23:53:47.417399 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417323 2569 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:47.417449 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417421 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:47.417449 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417441 2569 factory.go:103] Registering Raw factory Apr 24 23:53:47.417449 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417451 2569 manager.go:1196] Started watching for new ooms in manager 
Apr 24 23:53:47.417805 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.417791 2569 manager.go:319] Starting recovery of all containers Apr 24 23:53:47.419901 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.419805 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:53:47.424887 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.424847 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:53:47.428187 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.428166 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:47.429945 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.429926 2569 manager.go:324] Recovery completed Apr 24 23:53:47.430859 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.430836 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-30.ec2.internal\" not found" node="ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.434450 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.434438 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.436950 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.436933 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.437015 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.436964 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.437015 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.436978 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.437456 
ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.437441 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:47.437492 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.437458 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:47.437492 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.437486 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:47.440626 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.440614 2569 policy_none.go:49] "None policy: Start" Apr 24 23:53:47.440671 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.440630 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:47.440671 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.440640 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:47.475929 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.475915 2569 manager.go:341] "Starting Device Plugin manager" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.475943 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.475954 2569 server.go:85] "Starting device plugin registration server" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.476201 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.476214 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.476356 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.476439 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) 
starts" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.476446 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.476870 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 23:53:47.497897 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.476902 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.547801 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.547777 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:47.547801 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.547806 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:47.547959 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.547822 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 23:53:47.547959 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.547833 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:47.547959 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.547868 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:47.550723 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.550703 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:47.576776 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.576733 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.580457 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.580440 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.580535 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.580471 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.580535 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.580481 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.580535 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.580503 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.588324 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.588309 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.588413 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.588329 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-30.ec2.internal\": node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.601934 
ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.601915 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.648779 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.648734 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal"] Apr 24 23:53:47.648888 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.648829 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.650716 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.650699 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.650808 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.650733 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.650808 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.650750 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.653165 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.653149 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.653298 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.653283 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.653347 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.653312 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.656832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.656817 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.656908 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.656839 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.656908 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.656849 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.656908 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.656823 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.656908 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.656904 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.657050 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.656913 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.659136 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.659115 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.659136 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.659137 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.659778 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.659764 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.659840 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.659790 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.659840 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.659801 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.685941 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.685921 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-30.ec2.internal\" not found" node="ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.690166 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.690149 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-30.ec2.internal\" not found" node="ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.702198 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.702183 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.802316 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.802288 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.818704 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.818681 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b6448547fa43b3d7dc0fab15e5ba4e93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal\" (UID: \"b6448547fa43b3d7dc0fab15e5ba4e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.818758 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.818723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6448547fa43b3d7dc0fab15e5ba4e93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal\" (UID: \"b6448547fa43b3d7dc0fab15e5ba4e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.818758 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.818746 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02bbe2448f51d29d628074b87ed0230a-config\") pod \"kube-apiserver-proxy-ip-10-0-142-30.ec2.internal\" (UID: \"02bbe2448f51d29d628074b87ed0230a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.902993 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:47.902961 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:47.919336 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.919308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02bbe2448f51d29d628074b87ed0230a-config\") pod \"kube-apiserver-proxy-ip-10-0-142-30.ec2.internal\" (UID: \"02bbe2448f51d29d628074b87ed0230a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.919417 ip-10-0-142-30 kubenswrapper[2569]: I0424 
23:53:47.919352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b6448547fa43b3d7dc0fab15e5ba4e93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal\" (UID: \"b6448547fa43b3d7dc0fab15e5ba4e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.919459 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.919395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6448547fa43b3d7dc0fab15e5ba4e93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal\" (UID: \"b6448547fa43b3d7dc0fab15e5ba4e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.919459 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.919427 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02bbe2448f51d29d628074b87ed0230a-config\") pod \"kube-apiserver-proxy-ip-10-0-142-30.ec2.internal\" (UID: \"02bbe2448f51d29d628074b87ed0230a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.919528 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.919459 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6448547fa43b3d7dc0fab15e5ba4e93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal\" (UID: \"b6448547fa43b3d7dc0fab15e5ba4e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.919528 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.919432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b6448547fa43b3d7dc0fab15e5ba4e93-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal\" (UID: \"b6448547fa43b3d7dc0fab15e5ba4e93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.987441 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.987401 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:47.992102 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:47.992082 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" Apr 24 23:53:48.003713 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.003691 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:48.104474 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.104425 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:48.205026 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.204955 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:48.305455 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.305421 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-30.ec2.internal\" not found" Apr 24 23:53:48.331923 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.331901 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 23:53:48.332560 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.332051 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected 
watch close - watch lasted less than a second and no items received" Apr 24 23:53:48.332560 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.332078 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 23:53:48.375676 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.375657 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:48.397433 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.397404 2569 apiserver.go:52] "Watching apiserver" Apr 24 23:53:48.407832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.407806 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 23:53:48.408855 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.408834 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-zklf4","openshift-cluster-node-tuning-operator/tuned-fmgbl","openshift-dns/node-resolver-f9b5d","openshift-image-registry/node-ca-2skl8","openshift-multus/multus-c8t2m","openshift-multus/network-metrics-daemon-z72wj","openshift-network-diagnostics/network-check-target-zsqnv","openshift-network-operator/iptables-alerter-b9sll","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4","openshift-multus/multus-additional-cni-plugins-p5fs9","openshift-ovn-kubernetes/ovnkube-node-v9qcn"] Apr 24 23:53:48.412654 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.412630 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:47 +0000 UTC" deadline="2027-09-18 17:09:48.974523282 +0000 UTC" Apr 24 23:53:48.412654 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.412653 2569 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12281h16m0.561873325s" Apr 24 23:53:48.414102 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.414083 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.415530 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.415511 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:48.416151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.416132 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 23:53:48.416202 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.416159 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" Apr 24 23:53:48.416253 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.416236 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-l24q2\"" Apr 24 23:53:48.416291 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.416268 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 23:53:48.416807 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.416789 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.418701 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.418683 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tzfnr\"" Apr 24 23:53:48.418792 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.418688 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:48.418792 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.418690 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.419538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.419525 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.421243 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.421226 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 23:53:48.421424 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.421408 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d9czf\"" Apr 24 23:53:48.421504 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.421456 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.422119 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422105 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.422224 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-lib-modules\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422277 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-kubernetes\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422328 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-run\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422415 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-host\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422415 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422386 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-tuned\") pod 
\"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422516 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d49aa07c-0862-4d3d-85c1-e60a04019252-hosts-file\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.422516 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8d604d02-f21f-4a15-84ae-2c0a50e8c899-agent-certs\") pod \"konnectivity-agent-zklf4\" (UID: \"8d604d02-f21f-4a15-84ae-2c0a50e8c899\") " pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.422516 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysconfig\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422622 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422515 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d5c7b-4cdc-426b-9c01-0331c41c1293-tmp\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422622 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhwf4\" (UniqueName: 
\"kubernetes.io/projected/824d5c7b-4cdc-426b-9c01-0331c41c1293-kube-api-access-rhwf4\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422622 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d49aa07c-0862-4d3d-85c1-e60a04019252-tmp-dir\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.422622 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbhc8\" (UniqueName: \"kubernetes.io/projected/d49aa07c-0862-4d3d-85c1-e60a04019252-kube-api-access-rbhc8\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.422622 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422618 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8d604d02-f21f-4a15-84ae-2c0a50e8c899-konnectivity-ca\") pod \"konnectivity-agent-zklf4\" (UID: \"8d604d02-f21f-4a15-84ae-2c0a50e8c899\") " pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.422831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysctl-d\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422647 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysctl-conf\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-systemd\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-sys\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422748 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-var-lib-kubelet\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.422831 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.422789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-modprobe-d\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.424006 
ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.423989 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 23:53:48.424089 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.424043 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n62l4\"" Apr 24 23:53:48.424159 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.424147 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 23:53:48.424223 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.424161 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.424275 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.424248 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:48.424323 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.424297 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.424323 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.424296 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:53:48.426108 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fgsl7\"" Apr 24 23:53:48.426480 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426463 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 23:53:48.426555 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426527 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 23:53:48.426624 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426611 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.426690 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426673 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 23:53:48.426762 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426750 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" Apr 24 23:53:48.426819 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.426807 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 23:53:48.427203 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.427185 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:48.427318 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.427233 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:53:48.429679 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.429661 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b9sll" Apr 24 23:53:48.431557 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.431541 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:48.431895 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.431868 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.431895 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.431883 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n52cn\"" Apr 24 23:53:48.432040 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.431940 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 23:53:48.432439 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.432420 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.432614 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.432594 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:48.434713 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.434678 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 23:53:48.434924 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.434897 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 23:53:48.434988 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.434935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5b882\"" Apr 24 23:53:48.435570 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.435205 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 
23:53:48.435570 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.435264 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.436064 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.436048 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.437961 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.437945 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 23:53:48.438042 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.437989 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-r5stf\"" Apr 24 23:53:48.438109 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.437989 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 23:53:48.438518 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.438503 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal"] Apr 24 23:53:48.438583 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.438529 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal"] Apr 24 23:53:48.438633 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.438624 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.440462 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.440444 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 23:53:48.440462 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.440459 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q4glt\"" Apr 24 23:53:48.440615 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.440451 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 23:53:48.440671 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.440635 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 23:53:48.440671 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.440650 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 23:53:48.440771 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.440732 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 23:53:48.441118 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.441100 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 23:53:48.450035 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.450013 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r8kh4" Apr 24 23:53:48.458387 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.458355 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r8kh4" 
Apr 24 23:53:48.517462 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.517441 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:48.523167 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523146 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-hostroot\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.523271 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6jk\" (UniqueName: \"kubernetes.io/projected/865ada3d-5576-4faa-98c1-2b867558ffc0-kube-api-access-sh6jk\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:48.523271 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523199 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fwxv\" (UniqueName: \"kubernetes.io/projected/ed715705-a5a4-4ee1-b874-58c1cb13ea71-kube-api-access-5fwxv\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll" Apr 24 23:53:48.523271 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-socket-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.523271 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523245 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-run-netns\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.523531 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523283 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovnkube-config\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.523531 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-netns\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.523531 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-lib-modules\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523531 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " 
pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.523531 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-ovn\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.523531 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523508 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-lib-modules\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523774 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.523774 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:48.523774 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/ed715705-a5a4-4ee1-b874-58c1cb13ea71-iptables-alerter-script\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll" Apr 24 23:53:48.523774 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523713 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-log-socket\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.523774 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysctl-d\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523789 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysctl-conf\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523810 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-systemd\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-system-cni-dir\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523890 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-systemd\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-daemon-config\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 
23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523935 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysctl-d\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysctl-conf\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.523983 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.523962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dpwf\" (UniqueName: \"kubernetes.io/projected/273443fc-e308-4ad5-8ab7-60e95b67db82-kube-api-access-2dpwf\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524010 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-slash\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-cni-bin\") pod \"ovnkube-node-v9qcn\" (UID: 
\"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524080 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-os-release\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d49aa07c-0862-4d3d-85c1-e60a04019252-hosts-file\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-etc-selinux\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524162 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-system-cni-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524196 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-cnibin\") pod 
\"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524259 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d49aa07c-0862-4d3d-85c1-e60a04019252-hosts-file\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.524333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524312 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-multus-certs\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbhc8\" (UniqueName: \"kubernetes.io/projected/d49aa07c-0862-4d3d-85c1-e60a04019252-kube-api-access-rbhc8\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524362 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-var-lib-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-etc-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524505 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-etc-kubernetes\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcncj\" (UniqueName: \"kubernetes.io/projected/af315a28-630d-4b83-bf3b-2b3421fa929f-kube-api-access-lcncj\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524576 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed715705-a5a4-4ee1-b874-58c1cb13ea71-host-slash\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-sys\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-registration-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524656 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-node-log\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524680 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.524706 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-cni-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-sys\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524733 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-conf-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-modprobe-d\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524788 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxnl\" (UniqueName: \"kubernetes.io/projected/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-kube-api-access-9cxnl\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-systemd\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524857 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-modprobe-d\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 
23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524893 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-kubernetes\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-host\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524954 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-kubernetes\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.524987 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-host\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-kubelet-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 
23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9xn9\" (UniqueName: \"kubernetes.io/projected/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-kube-api-access-b9xn9\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysconfig\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525056 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-serviceca\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-kubelet\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: 
\"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525083 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525095 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-sysconfig\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovn-node-metrics-cert\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjxj8\" (UniqueName: \"kubernetes.io/projected/9c43aa69-d515-46b6-8b76-dd50b64985c6-kube-api-access-gjxj8\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525202 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-socket-dir-parent\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/8d604d02-f21f-4a15-84ae-2c0a50e8c899-konnectivity-ca\") pod \"konnectivity-agent-zklf4\" (UID: \"8d604d02-f21f-4a15-84ae-2c0a50e8c899\") " pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-os-release\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525284 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-env-overrides\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525326 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-cni-bin\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cnibin\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525390 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-cni-netd\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af315a28-630d-4b83-bf3b-2b3421fa929f-cni-binary-copy\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-k8s-cni-cncf-io\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-cni-multus\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-run\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525892 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:53:48.525530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-tuned\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525563 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-device-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.525892 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525596 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-run\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-systemd-units\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovnkube-script-lib\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525668 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-sys-fs\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8d604d02-f21f-4a15-84ae-2c0a50e8c899-agent-certs\") pod \"konnectivity-agent-zklf4\" (UID: \"8d604d02-f21f-4a15-84ae-2c0a50e8c899\") " pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8d604d02-f21f-4a15-84ae-2c0a50e8c899-konnectivity-ca\") pod \"konnectivity-agent-zklf4\" (UID: \"8d604d02-f21f-4a15-84ae-2c0a50e8c899\") " pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d5c7b-4cdc-426b-9c01-0331c41c1293-tmp\") pod \"tuned-fmgbl\" (UID: 
\"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhwf4\" (UniqueName: \"kubernetes.io/projected/824d5c7b-4cdc-426b-9c01-0331c41c1293-kube-api-access-rhwf4\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d49aa07c-0862-4d3d-85c1-e60a04019252-tmp-dir\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525927 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-kubelet\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.525976 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.526031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-var-lib-kubelet\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.526079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-host\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.526091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/824d5c7b-4cdc-426b-9c01-0331c41c1293-var-lib-kubelet\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.526722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.526180 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d49aa07c-0862-4d3d-85c1-e60a04019252-tmp-dir\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.529016 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.528982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/824d5c7b-4cdc-426b-9c01-0331c41c1293-etc-tuned\") pod \"tuned-fmgbl\" (UID: 
\"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.529104 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.529037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d5c7b-4cdc-426b-9c01-0331c41c1293-tmp\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.529303 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.529289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8d604d02-f21f-4a15-84ae-2c0a50e8c899-agent-certs\") pod \"konnectivity-agent-zklf4\" (UID: \"8d604d02-f21f-4a15-84ae-2c0a50e8c899\") " pod="kube-system/konnectivity-agent-zklf4" Apr 24 23:53:48.531029 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.531009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbhc8\" (UniqueName: \"kubernetes.io/projected/d49aa07c-0862-4d3d-85c1-e60a04019252-kube-api-access-rbhc8\") pod \"node-resolver-f9b5d\" (UID: \"d49aa07c-0862-4d3d-85c1-e60a04019252\") " pod="openshift-dns/node-resolver-f9b5d" Apr 24 23:53:48.536803 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.536780 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhwf4\" (UniqueName: \"kubernetes.io/projected/824d5c7b-4cdc-426b-9c01-0331c41c1293-kube-api-access-rhwf4\") pod \"tuned-fmgbl\" (UID: \"824d5c7b-4cdc-426b-9c01-0331c41c1293\") " pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" Apr 24 23:53:48.562707 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.562682 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6448547fa43b3d7dc0fab15e5ba4e93.slice/crio-1c078eb9fb425f478ed3e0bd17f0d5061b185082efa098b1d79db5c5d8197505 
WatchSource:0}: Error finding container 1c078eb9fb425f478ed3e0bd17f0d5061b185082efa098b1d79db5c5d8197505: Status 404 returned error can't find the container with id 1c078eb9fb425f478ed3e0bd17f0d5061b185082efa098b1d79db5c5d8197505 Apr 24 23:53:48.562906 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.562887 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02bbe2448f51d29d628074b87ed0230a.slice/crio-6a74ab7842df34931e6cec0d96213270b49dba1154888801c0c8e09825891a46 WatchSource:0}: Error finding container 6a74ab7842df34931e6cec0d96213270b49dba1154888801c0c8e09825891a46: Status 404 returned error can't find the container with id 6a74ab7842df34931e6cec0d96213270b49dba1154888801c0c8e09825891a46 Apr 24 23:53:48.567216 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.567201 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:53:48.626715 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-cni-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.626715 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-conf-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxnl\" (UniqueName: 
\"kubernetes.io/projected/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-kube-api-access-9cxnl\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-systemd\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-kubelet-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9xn9\" (UniqueName: \"kubernetes.io/projected/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-kube-api-access-b9xn9\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626795 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-conf-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626803 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-serviceca\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-cni-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-systemd\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626857 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-kubelet\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-kubelet-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.626901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-kubelet\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovn-node-metrics-cert\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjxj8\" (UniqueName: \"kubernetes.io/projected/9c43aa69-d515-46b6-8b76-dd50b64985c6-kube-api-access-gjxj8\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.626995 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-socket-dir-parent\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627008 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-os-release\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-socket-dir-parent\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627089 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-os-release\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-env-overrides\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627124 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-cni-bin\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627146 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cnibin\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627164 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-cni-netd\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627202 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-cni-bin\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627216 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af315a28-630d-4b83-bf3b-2b3421fa929f-cni-binary-copy\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627228 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-cni-netd\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-k8s-cni-cncf-io\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.627396 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627268 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-serviceca\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627254 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cnibin\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-cni-multus\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627321 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-cni-multus\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-k8s-cni-cncf-io\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-device-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-device-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627423 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-systemd-units\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627447 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovnkube-script-lib\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-sys-fs\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-systemd-units\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627514 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 
23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-kubelet\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-env-overrides\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-host\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-hostroot\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628162 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6jk\" (UniqueName: \"kubernetes.io/projected/865ada3d-5576-4faa-98c1-2b867558ffc0-kube-api-access-sh6jk\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:48.628925 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:53:48.627635 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-sys-fs\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fwxv\" (UniqueName: \"kubernetes.io/projected/ed715705-a5a4-4ee1-b874-58c1cb13ea71-kube-api-access-5fwxv\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627683 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-hostroot\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-socket-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-run-netns\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 
23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627754 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-host\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovnkube-config\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-netns\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af315a28-630d-4b83-bf3b-2b3421fa929f-cni-binary-copy\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.628925 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:53:48.627840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-ovn\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627850 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-socket-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-netns\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-run-netns\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 
23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627723 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-var-lib-kubelet\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.628925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627959 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed715705-a5a4-4ee1-b874-58c1cb13ea71-iptables-alerter-script\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.627991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-log-socket\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " 
pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-system-cni-dir\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-run-ovn\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.628076 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-log-socket\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628082 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-daemon-config\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.628166 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:49.128119672 +0000 UTC m=+2.118523687 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dpwf\" (UniqueName: \"kubernetes.io/projected/273443fc-e308-4ad5-8ab7-60e95b67db82-kube-api-access-2dpwf\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovnkube-config\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.629635 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-slash\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:53:48.629635 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:53:48.628255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-cni-bin\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628267 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-os-release\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628294 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-system-cni-dir\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-etc-selinux\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-system-cni-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628414 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-cnibin\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628438 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-multus-certs\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628463 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-var-lib-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-etc-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-etc-kubernetes\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-etc-selinux\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovnkube-script-lib\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcncj\" (UniqueName: \"kubernetes.io/projected/af315a28-630d-4b83-bf3b-2b3421fa929f-kube-api-access-lcncj\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-host-run-multus-certs\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ed715705-a5a4-4ee1-b874-58c1cb13ea71-iptables-alerter-script\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628621 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-os-release\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630068 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af315a28-630d-4b83-bf3b-2b3421fa929f-multus-daemon-config\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed715705-a5a4-4ee1-b874-58c1cb13ea71-host-slash\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628671 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-etc-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-registration-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-cnibin\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed715705-a5a4-4ee1-b874-58c1cb13ea71-host-slash\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628707 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-var-lib-openvswitch\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-node-log\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628746 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-cni-bin\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/273443fc-e308-4ad5-8ab7-60e95b67db82-registration-dir\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-etc-kubernetes\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af315a28-630d-4b83-bf3b-2b3421fa929f-system-cni-dir\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-node-log\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-slash\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c43aa69-d515-46b6-8b76-dd50b64985c6-host-run-ovn-kubernetes\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.628946 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-cni-binary-copy\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9"
Apr 24 23:53:48.630538 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.629171 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9"
Apr 24 23:53:48.631021 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.629660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c43aa69-d515-46b6-8b76-dd50b64985c6-ovn-node-metrics-cert\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.634805 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.634783 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:48.634805 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.634802 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:48.634972 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.634815 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:48.634972 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:48.634862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:49.134850041 +0000 UTC m=+2.125254036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:48.635171 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.635141 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fwxv\" (UniqueName: \"kubernetes.io/projected/ed715705-a5a4-4ee1-b874-58c1cb13ea71-kube-api-access-5fwxv\") pod \"iptables-alerter-b9sll\" (UID: \"ed715705-a5a4-4ee1-b874-58c1cb13ea71\") " pod="openshift-network-operator/iptables-alerter-b9sll"
Apr 24 23:53:48.635171 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.635157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjxj8\" (UniqueName: \"kubernetes.io/projected/9c43aa69-d515-46b6-8b76-dd50b64985c6-kube-api-access-gjxj8\") pod \"ovnkube-node-v9qcn\" (UID: \"9c43aa69-d515-46b6-8b76-dd50b64985c6\") " pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.635353 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.635258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxnl\" (UniqueName: \"kubernetes.io/projected/bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8-kube-api-access-9cxnl\") pod \"multus-additional-cni-plugins-p5fs9\" (UID: \"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8\") " pod="openshift-multus/multus-additional-cni-plugins-p5fs9"
Apr 24 23:53:48.635437 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.635421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6jk\" (UniqueName: \"kubernetes.io/projected/865ada3d-5576-4faa-98c1-2b867558ffc0-kube-api-access-sh6jk\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:53:48.635496 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.635466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9xn9\" (UniqueName: \"kubernetes.io/projected/2f91e8cd-3f4c-4228-8c96-94fcc37e0124-kube-api-access-b9xn9\") pod \"node-ca-2skl8\" (UID: \"2f91e8cd-3f4c-4228-8c96-94fcc37e0124\") " pod="openshift-image-registry/node-ca-2skl8"
Apr 24 23:53:48.637122 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.637099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dpwf\" (UniqueName: \"kubernetes.io/projected/273443fc-e308-4ad5-8ab7-60e95b67db82-kube-api-access-2dpwf\") pod \"aws-ebs-csi-driver-node-62tl4\" (UID: \"273443fc-e308-4ad5-8ab7-60e95b67db82\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4"
Apr 24 23:53:48.637242 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.637190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcncj\" (UniqueName: \"kubernetes.io/projected/af315a28-630d-4b83-bf3b-2b3421fa929f-kube-api-access-lcncj\") pod \"multus-c8t2m\" (UID: \"af315a28-630d-4b83-bf3b-2b3421fa929f\") " pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.755516 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.755434 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zklf4"
Apr 24 23:53:48.761801 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.761774 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d604d02_f21f_4a15_84ae_2c0a50e8c899.slice/crio-4cc0f78cfd3d09d3c6a5e50ecde49b409a31c90417a089fd89c6b1669aed6176 WatchSource:0}: Error finding container 4cc0f78cfd3d09d3c6a5e50ecde49b409a31c90417a089fd89c6b1669aed6176: Status 404 returned error can't find the container with id 4cc0f78cfd3d09d3c6a5e50ecde49b409a31c90417a089fd89c6b1669aed6176
Apr 24 23:53:48.770709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.770691 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fmgbl"
Apr 24 23:53:48.776967 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.776946 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824d5c7b_4cdc_426b_9c01_0331c41c1293.slice/crio-7b0fe0f1229c7962103e6ec802d036e0ff663869f24b035d2a5b6d94bb63e1ef WatchSource:0}: Error finding container 7b0fe0f1229c7962103e6ec802d036e0ff663869f24b035d2a5b6d94bb63e1ef: Status 404 returned error can't find the container with id 7b0fe0f1229c7962103e6ec802d036e0ff663869f24b035d2a5b6d94bb63e1ef
Apr 24 23:53:48.787401 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.787382 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f9b5d"
Apr 24 23:53:48.792979 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.792958 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49aa07c_0862_4d3d_85c1_e60a04019252.slice/crio-d020f524019756b9b914341b0ecc208befde4d7ee880a55572a0c8b83e1c41e1 WatchSource:0}: Error finding container d020f524019756b9b914341b0ecc208befde4d7ee880a55572a0c8b83e1c41e1: Status 404 returned error can't find the container with id d020f524019756b9b914341b0ecc208befde4d7ee880a55572a0c8b83e1c41e1
Apr 24 23:53:48.799803 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.799784 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2skl8"
Apr 24 23:53:48.807833 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.807810 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f91e8cd_3f4c_4228_8c96_94fcc37e0124.slice/crio-feeec2d508c551d0459bfccdb8e641c70773530d5236bcc4bb22f02661b319a8 WatchSource:0}: Error finding container feeec2d508c551d0459bfccdb8e641c70773530d5236bcc4bb22f02661b319a8: Status 404 returned error can't find the container with id feeec2d508c551d0459bfccdb8e641c70773530d5236bcc4bb22f02661b319a8
Apr 24 23:53:48.817409 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.817392 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c8t2m"
Apr 24 23:53:48.823067 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.822889 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf315a28_630d_4b83_bf3b_2b3421fa929f.slice/crio-5866e711c00afadb304bbd50b2728a69c13b3f7ea13f89c828a38e43014a285e WatchSource:0}: Error finding container 5866e711c00afadb304bbd50b2728a69c13b3f7ea13f89c828a38e43014a285e: Status 404 returned error can't find the container with id 5866e711c00afadb304bbd50b2728a69c13b3f7ea13f89c828a38e43014a285e
Apr 24 23:53:48.823316 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.823303 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b9sll"
Apr 24 23:53:48.828730 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.828708 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded715705_a5a4_4ee1_b874_58c1cb13ea71.slice/crio-664e9febad6c51a18477635658f6b89ae02c2d65c5fbb12284ce4fa6da8199c3 WatchSource:0}: Error finding container 664e9febad6c51a18477635658f6b89ae02c2d65c5fbb12284ce4fa6da8199c3: Status 404 returned error can't find the container with id 664e9febad6c51a18477635658f6b89ae02c2d65c5fbb12284ce4fa6da8199c3
Apr 24 23:53:48.830082 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.830068 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4"
Apr 24 23:53:48.835477 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.835458 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod273443fc_e308_4ad5_8ab7_60e95b67db82.slice/crio-e11928431443cbf95fddc63bc7340e04c93363ab2958ec011daaac70e5375bcd WatchSource:0}: Error finding container e11928431443cbf95fddc63bc7340e04c93363ab2958ec011daaac70e5375bcd: Status 404 returned error can't find the container with id e11928431443cbf95fddc63bc7340e04c93363ab2958ec011daaac70e5375bcd
Apr 24 23:53:48.859607 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.859587 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:48.863483 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.863453 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p5fs9"
Apr 24 23:53:48.868080 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:48.868057 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:53:48.870579 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.870558 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbc3d61a_f6ee_4df6_be84_8ee36cb3e6d8.slice/crio-0e412879b376d63f4a8511a3e55473d5337d9a4a934a81f23c473cc03d64bf2b WatchSource:0}: Error finding container 0e412879b376d63f4a8511a3e55473d5337d9a4a934a81f23c473cc03d64bf2b: Status 404 returned error can't find the container with id 0e412879b376d63f4a8511a3e55473d5337d9a4a934a81f23c473cc03d64bf2b
Apr 24 23:53:48.874454 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:53:48.874435 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c43aa69_d515_46b6_8b76_dd50b64985c6.slice/crio-d372a3b68199b177205d3085ebbb4201655f7940b4cf0c84a2aa456393c00542 WatchSource:0}: Error finding container d372a3b68199b177205d3085ebbb4201655f7940b4cf0c84a2aa456393c00542: Status 404 returned error can't find the container with id d372a3b68199b177205d3085ebbb4201655f7940b4cf0c84a2aa456393c00542
Apr 24 23:53:49.132660 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.132619 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:53:49.132820 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:49.132761 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:49.132895 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:49.132826 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:50.132807156 +0000 UTC m=+3.123211147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:49.233200 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.233166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:53:49.233383 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:49.233332 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:49.233383 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:49.233350 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:49.233383 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:49.233363 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:49.233554 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:49.233436 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:50.233416718 +0000 UTC m=+3.223820714 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:49.459709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.459622 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:48 +0000 UTC" deadline="2027-09-18 10:18:19.44653517 +0000 UTC"
Apr 24 23:53:49.459709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.459661 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12274h24m29.986877481s"
Apr 24 23:53:49.547588 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.547558 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:49.581683 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.581617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerStarted","Data":"0e412879b376d63f4a8511a3e55473d5337d9a4a934a81f23c473cc03d64bf2b"}
Apr 24 23:53:49.588803 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.588765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" event={"ID":"273443fc-e308-4ad5-8ab7-60e95b67db82","Type":"ContainerStarted","Data":"e11928431443cbf95fddc63bc7340e04c93363ab2958ec011daaac70e5375bcd"}
Apr 24 23:53:49.600089 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.600056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b9sll" event={"ID":"ed715705-a5a4-4ee1-b874-58c1cb13ea71","Type":"ContainerStarted","Data":"664e9febad6c51a18477635658f6b89ae02c2d65c5fbb12284ce4fa6da8199c3"}
Apr 24 23:53:49.609301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.609265 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zklf4" event={"ID":"8d604d02-f21f-4a15-84ae-2c0a50e8c899","Type":"ContainerStarted","Data":"4cc0f78cfd3d09d3c6a5e50ecde49b409a31c90417a089fd89c6b1669aed6176"}
Apr 24 23:53:49.619692 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.619658 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" event={"ID":"02bbe2448f51d29d628074b87ed0230a","Type":"ContainerStarted","Data":"6a74ab7842df34931e6cec0d96213270b49dba1154888801c0c8e09825891a46"}
Apr 24 23:53:49.638681 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.638642 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" event={"ID":"b6448547fa43b3d7dc0fab15e5ba4e93","Type":"ContainerStarted","Data":"1c078eb9fb425f478ed3e0bd17f0d5061b185082efa098b1d79db5c5d8197505"}
Apr 24 23:53:49.648435 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.648336 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"d372a3b68199b177205d3085ebbb4201655f7940b4cf0c84a2aa456393c00542"}
Apr 24 23:53:49.661696 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.661607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c8t2m" event={"ID":"af315a28-630d-4b83-bf3b-2b3421fa929f","Type":"ContainerStarted","Data":"5866e711c00afadb304bbd50b2728a69c13b3f7ea13f89c828a38e43014a285e"}
Apr 24 23:53:49.675043 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.674968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2skl8" event={"ID":"2f91e8cd-3f4c-4228-8c96-94fcc37e0124","Type":"ContainerStarted","Data":"feeec2d508c551d0459bfccdb8e641c70773530d5236bcc4bb22f02661b319a8"}
Apr 24 23:53:49.696623 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.696565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f9b5d" event={"ID":"d49aa07c-0862-4d3d-85c1-e60a04019252","Type":"ContainerStarted","Data":"d020f524019756b9b914341b0ecc208befde4d7ee880a55572a0c8b83e1c41e1"}
Apr 24 23:53:49.719772 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.719664 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" event={"ID":"824d5c7b-4cdc-426b-9c01-0331c41c1293","Type":"ContainerStarted","Data":"7b0fe0f1229c7962103e6ec802d036e0ff663869f24b035d2a5b6d94bb63e1ef"}
Apr 24 23:53:49.899452 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:49.899420 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:50.144363 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:50.144314 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:53:50.144542 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.144487 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:50.144613 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.144558 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:52.144537984 +0000 UTC m=+5.134941984 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:50.244780 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:50.244743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:53:50.244961 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.244908 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:50.244961 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.244930 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:50.244961 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.244942 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:50.245115 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.245000 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. No retries permitted until 2026-04-24 23:53:52.244982083 +0000 UTC m=+5.235386122 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:50.460267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:50.460172 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:48 +0000 UTC" deadline="2027-10-04 03:01:12.653384662 +0000 UTC"
Apr 24 23:53:50.460267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:50.460216 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12651h7m22.19317313s"
Apr 24 23:53:50.548494 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:50.548389 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:53:50.548665 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.548528 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:53:50.548985 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:50.548387 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:50.548985 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:50.548911 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:53:52.161069 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:52.161033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:52.161554 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.161214 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:52.161554 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.161283 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:56.16126556 +0000 UTC m=+9.151669551 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:52.261927 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:52.261888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:52.262088 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.262066 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:52.262088 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.262086 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:52.262205 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.262100 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:52.262205 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.262172 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:56.262152013 +0000 UTC m=+9.252556009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:52.548774 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:52.548690 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:52.548921 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.548838 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:53:52.549257 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:52.549239 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:52.549347 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:52.549329 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:53:54.548978 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:54.548940 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:54.549443 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:54.549088 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:53:54.549704 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:54.548938 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:54.549704 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:54.549657 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:53:56.194049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:56.193962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:56.194549 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.194093 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:56.194549 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.194165 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:04.194144362 +0000 UTC m=+17.184548354 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:56.294714 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:56.294668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:56.294898 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.294838 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:56.294898 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.294861 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:56.294898 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.294876 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:56.295059 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.294940 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:04.294920515 +0000 UTC m=+17.285324508 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:56.549061 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:56.548973 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:56.549061 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:56.549025 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:56.549273 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.549126 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:53:56.549508 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:56.549464 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:53:58.548328 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:58.548292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:53:58.548724 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:58.548438 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:53:58.548724 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:53:58.548292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:53:58.548724 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:53:58.548655 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:00.548494 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:00.548459 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:00.548957 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:00.548500 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:00.548957 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:00.548591 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:00.548957 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:00.548731 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:54:02.548867 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:02.548821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:02.548867 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:02.548849 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:02.549362 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:02.548973 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:54:02.549362 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:02.549107 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:04.247417 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:04.247355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:04.248024 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.247509 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:04.248024 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.247597 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:20.247571398 +0000 UTC m=+33.237975389 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:04.347801 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:04.347755 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:04.347975 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.347932 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:04.347975 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.347959 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:04.347975 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.347974 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:04.348102 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.348041 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:20.348023582 +0000 UTC m=+33.338427579 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:04.548397 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:04.548312 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:04.548551 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:04.548312 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:04.548551 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.548444 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:04.548551 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:04.548514 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:54:06.548339 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:06.548314 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:06.548628 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:06.548355 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:06.548628 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:06.548444 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:54:06.548628 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:06.548491 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:07.758834 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.758439 2569 generic.go:358] "Generic (PLEG): container finished" podID="b6448547fa43b3d7dc0fab15e5ba4e93" containerID="efd1e04a1943f468cabd471c75dc2e4aa00c7c2388d84bb2230c21351b3644ac" exitCode=0 Apr 24 23:54:07.758834 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.758534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" event={"ID":"b6448547fa43b3d7dc0fab15e5ba4e93","Type":"ContainerDied","Data":"efd1e04a1943f468cabd471c75dc2e4aa00c7c2388d84bb2230c21351b3644ac"} Apr 24 23:54:07.763499 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 24 23:54:07.763770 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763748 2569 generic.go:358] "Generic (PLEG): container finished" podID="9c43aa69-d515-46b6-8b76-dd50b64985c6" containerID="a19d52707150c5b52a2a9e542b402200d9b8fe24ba5e9bfe112d7bccea740c07" exitCode=1 Apr 24 23:54:07.763818 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763769 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"df0cd41c8aa7aaac902df982baa3746315b329fa7c8e0613c5986d3021260944"} Apr 24 23:54:07.763818 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"b1f846249bdd7ac6457c1d2a49b76bea2fd3e7a2c771b6bc3a5bab100d7ba64e"} Apr 24 23:54:07.763818 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:54:07.763807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"1f6b03c0368f21c1247031e8858bf99f1692ddab5bf00333fd5577595fbfd33e"} Apr 24 23:54:07.763818 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763817 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"cdaf3f3f7b68d981da511603d181e9e214bda5b227adac67372a966de3601242"} Apr 24 23:54:07.763954 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763825 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerDied","Data":"a19d52707150c5b52a2a9e542b402200d9b8fe24ba5e9bfe112d7bccea740c07"} Apr 24 23:54:07.763954 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.763834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"8e084aaddc58149e705f45bc9bbf6021585afc330756195f3f31fad21228380c"} Apr 24 23:54:07.765077 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.765055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c8t2m" event={"ID":"af315a28-630d-4b83-bf3b-2b3421fa929f","Type":"ContainerStarted","Data":"c4e704b42e21a25d8a059c84edd4ac8df89c268d6faa45d2ca46c10732a4773a"} Apr 24 23:54:07.766175 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.766154 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2skl8" event={"ID":"2f91e8cd-3f4c-4228-8c96-94fcc37e0124","Type":"ContainerStarted","Data":"7ce9d396b34402486277f749d2e4dc505670a221aa2d20f2506f90b02f42f668"} Apr 24 23:54:07.767240 
ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.767219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f9b5d" event={"ID":"d49aa07c-0862-4d3d-85c1-e60a04019252","Type":"ContainerStarted","Data":"d0af7c490abd6c5e90acd48d30fe9a79a5c8a132f15d5bf317258c2f22548d1d"}
Apr 24 23:54:07.768479 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.768452 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" event={"ID":"824d5c7b-4cdc-426b-9c01-0331c41c1293","Type":"ContainerStarted","Data":"6f7920743aebe0079ce714378abf41800f2c3e19f284587e2a4710f840f07cd1"}
Apr 24 23:54:07.769759 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.769742 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8" containerID="7be57ab1b5db46f5430ccb62dc15133052648bb9f478c900619db0f2e68fccfe" exitCode=0
Apr 24 23:54:07.769844 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.769770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerDied","Data":"7be57ab1b5db46f5430ccb62dc15133052648bb9f478c900619db0f2e68fccfe"}
Apr 24 23:54:07.771141 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.771084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" event={"ID":"273443fc-e308-4ad5-8ab7-60e95b67db82","Type":"ContainerStarted","Data":"59531aff244dcd31c323b76369ff216a55dae1d24d6aa3c5d9566c5c5e0762ff"}
Apr 24 23:54:07.772318 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.772298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zklf4" event={"ID":"8d604d02-f21f-4a15-84ae-2c0a50e8c899","Type":"ContainerStarted","Data":"1affb5350b8a48925099462785d9a9f0f39aa119000b24873ff81ee4c57331b2"}
Apr 24 23:54:07.773448 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.773429 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" event={"ID":"02bbe2448f51d29d628074b87ed0230a","Type":"ContainerStarted","Data":"e06fef24de7a4165f5434227239fa46c576bdcb4d84d2bd8a288f6a224f48678"}
Apr 24 23:54:07.797211 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.797144 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zklf4" podStartSLOduration=3.056559995 podStartE2EDuration="20.79712721s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.763309268 +0000 UTC m=+1.753713258" lastFinishedPulling="2026-04-24 23:54:06.503876469 +0000 UTC m=+19.494280473" observedRunningTime="2026-04-24 23:54:07.797035596 +0000 UTC m=+20.787439609" watchObservedRunningTime="2026-04-24 23:54:07.79712721 +0000 UTC m=+20.787531223"
Apr 24 23:54:07.841323 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.841264 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-30.ec2.internal" podStartSLOduration=19.841245518 podStartE2EDuration="19.841245518s" podCreationTimestamp="2026-04-24 23:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:07.840771651 +0000 UTC m=+20.831175659" watchObservedRunningTime="2026-04-24 23:54:07.841245518 +0000 UTC m=+20.831649530"
Apr 24 23:54:07.856833 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.856781 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fmgbl" podStartSLOduration=3.102101882 podStartE2EDuration="20.856762476s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.778883231 +0000 UTC m=+1.769287223" lastFinishedPulling="2026-04-24 23:54:06.533543812 +0000 UTC m=+19.523947817" observedRunningTime="2026-04-24 23:54:07.855931046 +0000 UTC m=+20.846335072" watchObservedRunningTime="2026-04-24 23:54:07.856762476 +0000 UTC m=+20.847166491"
Apr 24 23:54:07.884322 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.884272 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f9b5d" podStartSLOduration=3.14789022 podStartE2EDuration="20.884225689s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.794481348 +0000 UTC m=+1.784885339" lastFinishedPulling="2026-04-24 23:54:06.530816806 +0000 UTC m=+19.521220808" observedRunningTime="2026-04-24 23:54:07.870552002 +0000 UTC m=+20.860956015" watchObservedRunningTime="2026-04-24 23:54:07.884225689 +0000 UTC m=+20.874629697"
Apr 24 23:54:07.884469 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.884396 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2skl8" podStartSLOduration=3.1627408089999998 podStartE2EDuration="20.884389126s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.809270738 +0000 UTC m=+1.799674728" lastFinishedPulling="2026-04-24 23:54:06.530919053 +0000 UTC m=+19.521323045" observedRunningTime="2026-04-24 23:54:07.884139641 +0000 UTC m=+20.874543656" watchObservedRunningTime="2026-04-24 23:54:07.884389126 +0000 UTC m=+20.874793136"
Apr 24 23:54:07.903098 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:07.903042 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c8t2m" podStartSLOduration=3.192867586 podStartE2EDuration="20.903024712s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.824206769 +0000 UTC m=+1.814610764" lastFinishedPulling="2026-04-24 23:54:06.534363892 +0000 UTC m=+19.524767890" observedRunningTime="2026-04-24 23:54:07.902636209 +0000 UTC m=+20.893040235" watchObservedRunningTime="2026-04-24 23:54:07.903024712 +0000 UTC m=+20.893428727"
Apr 24 23:54:08.345643 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.345596 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 23:54:08.486041 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.485894 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:54:08.345638907Z","UUID":"c72786c9-fde9-4b9f-b146-49002194b04c","Handler":null,"Name":"","Endpoint":""}
Apr 24 23:54:08.488165 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.488001 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 23:54:08.488165 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.488036 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 23:54:08.548896 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.548713 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:08.549065 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.548713 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:08.549065 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:08.548994 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:08.549177 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:08.549072 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:08.777876 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.777790 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" event={"ID":"273443fc-e308-4ad5-8ab7-60e95b67db82","Type":"ContainerStarted","Data":"f0bc0279abe97ffe2a9fdf93b9b6b15d422a088b2214f2c7ca1dfa3079c2cf75"}
Apr 24 23:54:08.779382 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.779337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b9sll" event={"ID":"ed715705-a5a4-4ee1-b874-58c1cb13ea71","Type":"ContainerStarted","Data":"53c626e9eb702f6e3ccb23d1482a6f4a4a138b00d3c7b603d20f631d43400af1"}
Apr 24 23:54:08.782751 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.782718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" event={"ID":"b6448547fa43b3d7dc0fab15e5ba4e93","Type":"ContainerStarted","Data":"e5112a918097eb3f3c05e616e837c2fb5fc8f9e5cf173c898eacfc9cb1f2c255"}
Apr 24 23:54:08.818498 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.818449 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-b9sll" podStartSLOduration=4.114283025 podStartE2EDuration="21.818430038s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.830139678 +0000 UTC m=+1.820543669" lastFinishedPulling="2026-04-24 23:54:06.534286682 +0000 UTC m=+19.524690682" observedRunningTime="2026-04-24 23:54:08.798293065 +0000 UTC m=+21.788697078" watchObservedRunningTime="2026-04-24 23:54:08.818430038 +0000 UTC m=+21.808834052"
Apr 24 23:54:08.818924 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:08.818889 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-30.ec2.internal" podStartSLOduration=20.818878544 podStartE2EDuration="20.818878544s" podCreationTimestamp="2026-04-24 23:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:08.81879708 +0000 UTC m=+21.809201093" watchObservedRunningTime="2026-04-24 23:54:08.818878544 +0000 UTC m=+21.809282561"
Apr 24 23:54:09.758263 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:09.758223 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zklf4"
Apr 24 23:54:09.758993 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:09.758971 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zklf4"
Apr 24 23:54:09.786698 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:09.786672 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log"
Apr 24 23:54:09.787227 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:09.787055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"8bb5e8117de429e26891ce96218e596edb7df1f5eeba722ac583f152bf2cd808"}
Apr 24 23:54:09.789029 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:09.788959 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" event={"ID":"273443fc-e308-4ad5-8ab7-60e95b67db82","Type":"ContainerStarted","Data":"7ce017d0178f88d7d0d45cb4ace5e526db904930bfec47aa2e67c15f2476d9ac"}
Apr 24 23:54:10.548906 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:10.548871 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:10.548906 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:10.548898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:10.549129 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:10.548985 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:10.549129 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:10.549117 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:10.791000 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:10.790973 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 23:54:12.548524 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.548331 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:12.548996 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.548332 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:12.548996 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:12.548684 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:12.548996 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:12.548590 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:12.701135 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.701092 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zklf4"
Apr 24 23:54:12.701304 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.701232 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 23:54:12.701800 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.701779 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zklf4"
Apr 24 23:54:12.718671 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.718627 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62tl4" podStartSLOduration=5.459751515 podStartE2EDuration="25.718610194s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.836970733 +0000 UTC m=+1.827374727" lastFinishedPulling="2026-04-24 23:54:09.095829406 +0000 UTC m=+22.086233406" observedRunningTime="2026-04-24 23:54:09.809117703 +0000 UTC m=+22.799521732" watchObservedRunningTime="2026-04-24 23:54:12.718610194 +0000 UTC m=+25.709014292"
Apr 24 23:54:12.798220 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.798190 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log"
Apr 24 23:54:12.798622 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.798561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"ed8497e64e3d34d8a99d08b222531d6afde6796a02c9e5cf87841d7522e70495"}
Apr 24 23:54:12.798888 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.798867 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:54:12.799106 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.799080 2569 scope.go:117] "RemoveContainer" containerID="a19d52707150c5b52a2a9e542b402200d9b8fe24ba5e9bfe112d7bccea740c07"
Apr 24 23:54:12.800272 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.800250 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8" containerID="21995af1ecd7b7a0bdd39e5629f57496fd9807cafc2039679a9a628a36522d37" exitCode=0
Apr 24 23:54:12.800333 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.800308 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerDied","Data":"21995af1ecd7b7a0bdd39e5629f57496fd9807cafc2039679a9a628a36522d37"}
Apr 24 23:54:12.814459 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:12.814435 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:54:13.804433 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.804331 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8" containerID="b11f558f81f5ff1f353dc498275d31fee5605b218ba70d1d0297c5a8aa19cc1a" exitCode=0
Apr 24 23:54:13.804833 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.804456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerDied","Data":"b11f558f81f5ff1f353dc498275d31fee5605b218ba70d1d0297c5a8aa19cc1a"}
Apr 24 23:54:13.807809 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.807790 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log"
Apr 24 23:54:13.808126 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.808106 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" event={"ID":"9c43aa69-d515-46b6-8b76-dd50b64985c6","Type":"ContainerStarted","Data":"aacdb8660d8b5b984fcc1da90e7eb8d3496aa7ebd3845b957da015107b1f5a2f"}
Apr 24 23:54:13.808419 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.808402 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:54:13.808511 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.808425 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:54:13.822237 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.822213 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn"
Apr 24 23:54:13.852963 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:13.852909 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" podStartSLOduration=8.868683793 podStartE2EDuration="26.852894819s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.875996708 +0000 UTC m=+1.866400699" lastFinishedPulling="2026-04-24 23:54:06.860207731 +0000 UTC m=+19.850611725" observedRunningTime="2026-04-24 23:54:13.852683844 +0000 UTC m=+26.843087856" watchObservedRunningTime="2026-04-24 23:54:13.852894819 +0000 UTC m=+26.843298832"
Apr 24 23:54:14.548771 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:14.548740 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:14.548924 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:14.548740 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:14.548924 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:14.548840 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:14.548924 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:14.548913 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:14.811574 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:14.811529 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8" containerID="ddc95928303398a5826153c5a7451707c72c15f859b0d2aee8aa172a060555a9" exitCode=0
Apr 24 23:54:14.811922 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:14.811600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerDied","Data":"ddc95928303398a5826153c5a7451707c72c15f859b0d2aee8aa172a060555a9"}
Apr 24 23:54:16.548646 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:16.548611 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:16.549072 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:16.548725 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:16.549072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:16.548788 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:16.549072 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:16.548919 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:18.548498 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:18.548462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:18.548891 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:18.548462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:18.548891 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:18.548568 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:18.548891 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:18.548645 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:20.265260 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:20.265226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:20.265720 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.265344 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:20.265720 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.265413 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs podName:865ada3d-5576-4faa-98c1-2b867558ffc0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:52.265395886 +0000 UTC m=+65.255799878 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs") pod "network-metrics-daemon-z72wj" (UID: "865ada3d-5576-4faa-98c1-2b867558ffc0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:20.366509 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:20.366480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:20.366667 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.366606 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:20.366667 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.366620 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:20.366667 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.366628 2569 projected.go:194] Error preparing data for projected volume kube-api-access-6ktzp for pod openshift-network-diagnostics/network-check-target-zsqnv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:20.366774 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.366673 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp podName:2bca4834-ef71-4be6-ac75-bf2bc736877d nodeName:}" failed. No retries permitted until 2026-04-24 23:54:52.36666054 +0000 UTC m=+65.357064530 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6ktzp" (UniqueName: "kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp") pod "network-check-target-zsqnv" (UID: "2bca4834-ef71-4be6-ac75-bf2bc736877d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:20.548526 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:20.548499 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:20.548656 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:20.548499 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:20.548656 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.548621 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:20.548767 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:20.548674 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:20.825453 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:20.825420 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8" containerID="7559e5f4386a94a2b60bb8bb9fb6d0b24007b358331ebf616734e4730051c2e0" exitCode=0
Apr 24 23:54:20.825579 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:20.825484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerDied","Data":"7559e5f4386a94a2b60bb8bb9fb6d0b24007b358331ebf616734e4730051c2e0"}
Apr 24 23:54:21.829433 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:21.829391 2569 generic.go:358] "Generic (PLEG): container finished" podID="bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8" containerID="236b3e6810c97522b664574308181ac323f5f11dfad25e664c9e9789d06188ca" exitCode=0
Apr 24 23:54:21.829958 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:21.829449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerDied","Data":"236b3e6810c97522b664574308181ac323f5f11dfad25e664c9e9789d06188ca"}
Apr 24 23:54:22.548904 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:22.548862 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:22.549098 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:22.548862 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:22.549098 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:22.548974 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:22.549098 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:22.549069 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:22.834468 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:22.834395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" event={"ID":"bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8","Type":"ContainerStarted","Data":"ebaa9e8e2f70839699ab3d005a1051dd7017b32ad6b5de3fb103743917816315"}
Apr 24 23:54:22.855192 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:22.855151 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p5fs9" podStartSLOduration=4.226032562 podStartE2EDuration="35.855138851s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:48.872503506 +0000 UTC m=+1.862907496" lastFinishedPulling="2026-04-24 23:54:20.501609794 +0000 UTC m=+33.492013785" observedRunningTime="2026-04-24 23:54:22.854809168 +0000 UTC m=+35.845213180" watchObservedRunningTime="2026-04-24 23:54:22.855138851 +0000 UTC m=+35.845542859"
Apr 24 23:54:24.548835 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:24.548797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:24.549205 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:24.548796 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:24.549205 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:24.548914 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:24.549205 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:24.548981 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:26.548929 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:26.548895 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:26.549295 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:26.548894 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:26.549295 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:26.549000 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:26.549295 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:26.549066 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:28.548292 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:28.548265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:28.548855 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:28.548265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:28.548855 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:28.548355 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:28.548855 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:28.548449 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:30.548627 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:30.548591 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:30.549018 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:30.548591 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:30.549018 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:30.548700 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:30.549018 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:30.548763 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d"
Apr 24 23:54:32.097455 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:32.097190 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z72wj"]
Apr 24 23:54:32.098045 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:32.097555 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj"
Apr 24 23:54:32.098045 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:32.097704 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0"
Apr 24 23:54:32.099661 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:32.099639 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zsqnv"]
Apr 24 23:54:32.099768 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:32.099712 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv"
Apr 24 23:54:32.099812 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:32.099786 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:33.548613 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:33.548571 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:33.548985 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:33.548571 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:33.548985 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:33.548695 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:33.548985 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:33.548758 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:54:35.548915 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.548876 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:35.549363 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:35.548985 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zsqnv" podUID="2bca4834-ef71-4be6-ac75-bf2bc736877d" Apr 24 23:54:35.549363 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.548878 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:35.549363 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:35.549060 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z72wj" podUID="865ada3d-5576-4faa-98c1-2b867558ffc0" Apr 24 23:54:35.855122 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.855038 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-30.ec2.internal" event="NodeReady" Apr 24 23:54:35.855289 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.855180 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:54:35.901356 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.901325 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fcfzm"] Apr 24 23:54:35.927405 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.927355 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qm5m4"] Apr 24 23:54:35.927571 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.927523 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:35.930608 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.930582 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:35.930608 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.930605 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgj74\"" Apr 24 23:54:35.930804 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.930751 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:35.940489 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.940467 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fcfzm"] Apr 24 23:54:35.940587 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.940493 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qm5m4"] Apr 24 
23:54:35.940587 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.940578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:35.945288 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.944215 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nb5d2\"" Apr 24 23:54:35.945288 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.944548 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:54:35.945288 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.944854 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:54:35.945288 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:35.945235 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:54:36.086197 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.086157 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-config-volume\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.086440 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.086204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp85c\" (UniqueName: \"kubernetes.io/projected/5dbdb447-15bf-4e90-abe6-da3abc588e4a-kube-api-access-fp85c\") pod \"ingress-canary-qm5m4\" (UID: \"5dbdb447-15bf-4e90-abe6-da3abc588e4a\") " pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.086440 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.086247 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dbdb447-15bf-4e90-abe6-da3abc588e4a-cert\") pod \"ingress-canary-qm5m4\" (UID: \"5dbdb447-15bf-4e90-abe6-da3abc588e4a\") " pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.086440 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.086289 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvlwb\" (UniqueName: \"kubernetes.io/projected/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-kube-api-access-fvlwb\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.086440 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.086337 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-metrics-tls\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.086440 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.086359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-tmp-dir\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.187114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-config-volume\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.187114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187115 
2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp85c\" (UniqueName: \"kubernetes.io/projected/5dbdb447-15bf-4e90-abe6-da3abc588e4a-kube-api-access-fp85c\") pod \"ingress-canary-qm5m4\" (UID: \"5dbdb447-15bf-4e90-abe6-da3abc588e4a\") " pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.187321 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dbdb447-15bf-4e90-abe6-da3abc588e4a-cert\") pod \"ingress-canary-qm5m4\" (UID: \"5dbdb447-15bf-4e90-abe6-da3abc588e4a\") " pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.187321 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187194 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvlwb\" (UniqueName: \"kubernetes.io/projected/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-kube-api-access-fvlwb\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.187321 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-metrics-tls\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.187321 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-tmp-dir\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.187592 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187571 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-tmp-dir\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.187766 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.187748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-config-volume\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.191222 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.191196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-metrics-tls\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.191331 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.191314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dbdb447-15bf-4e90-abe6-da3abc588e4a-cert\") pod \"ingress-canary-qm5m4\" (UID: \"5dbdb447-15bf-4e90-abe6-da3abc588e4a\") " pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.202808 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.202784 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvlwb\" (UniqueName: \"kubernetes.io/projected/e80ea08a-a6a9-4898-a4aa-e2a17c1e990f-kube-api-access-fvlwb\") pod \"dns-default-fcfzm\" (UID: \"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f\") " pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.202922 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.202876 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp85c\" (UniqueName: 
\"kubernetes.io/projected/5dbdb447-15bf-4e90-abe6-da3abc588e4a-kube-api-access-fp85c\") pod \"ingress-canary-qm5m4\" (UID: \"5dbdb447-15bf-4e90-abe6-da3abc588e4a\") " pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.236722 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.236687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:36.250440 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.250418 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qm5m4" Apr 24 23:54:36.388812 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.388553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fcfzm"] Apr 24 23:54:36.392140 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.392098 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qm5m4"] Apr 24 23:54:36.395145 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:36.395125 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dbdb447_15bf_4e90_abe6_da3abc588e4a.slice/crio-b75326e826f5ab2ac5de722808b76a42329c5fb30e1b009ae6c6fc8dfa9c6248 WatchSource:0}: Error finding container b75326e826f5ab2ac5de722808b76a42329c5fb30e1b009ae6c6fc8dfa9c6248: Status 404 returned error can't find the container with id b75326e826f5ab2ac5de722808b76a42329c5fb30e1b009ae6c6fc8dfa9c6248 Apr 24 23:54:36.443360 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.443294 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t"] Apr 24 23:54:36.464967 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.464944 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t"] Apr 24 
23:54:36.465091 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.465042 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 23:54:36.467471 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.467448 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-clz55\"" Apr 24 23:54:36.467592 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.467502 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 23:54:36.590174 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.590140 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5sb5t\" (UID: \"4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 23:54:36.691052 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.691023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5sb5t\" (UID: \"4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 23:54:36.694191 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.694136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5sb5t\" 
(UID: \"4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 23:54:36.774080 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.774045 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 23:54:36.862898 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.862833 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qm5m4" event={"ID":"5dbdb447-15bf-4e90-abe6-da3abc588e4a","Type":"ContainerStarted","Data":"b75326e826f5ab2ac5de722808b76a42329c5fb30e1b009ae6c6fc8dfa9c6248"} Apr 24 23:54:36.864346 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.864317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fcfzm" event={"ID":"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f","Type":"ContainerStarted","Data":"864e5f177c030f7237f77c10adced6c0f936f8de5f073425a0f1917f918d5807"} Apr 24 23:54:36.918549 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.918020 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t"] Apr 24 23:54:36.922495 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:36.922461 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e0b59e8_7176_4cd4_90c7_cef0e7fe3b4f.slice/crio-28a482fe329eafc699d771f36a3b393dbbc787172afd02d8117b52a0490a49ca WatchSource:0}: Error finding container 28a482fe329eafc699d771f36a3b393dbbc787172afd02d8117b52a0490a49ca: Status 404 returned error can't find the container with id 28a482fe329eafc699d771f36a3b393dbbc787172afd02d8117b52a0490a49ca Apr 24 23:54:36.945851 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.945789 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv"] Apr 24 23:54:36.964429 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.964386 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv"] Apr 24 23:54:36.964570 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.964536 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" Apr 24 23:54:36.967476 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.967232 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kdsm2\"" Apr 24 23:54:36.967476 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.967312 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 23:54:36.967476 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:36.967419 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 23:54:37.094154 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.094117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbxc\" (UniqueName: \"kubernetes.io/projected/b323d79b-386c-4e79-91fe-2cdaf82ab912-kube-api-access-lpbxc\") pod \"migrator-74bb7799d9-59slv\" (UID: \"b323d79b-386c-4e79-91fe-2cdaf82ab912\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" Apr 24 23:54:37.194645 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.194603 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbxc\" (UniqueName: \"kubernetes.io/projected/b323d79b-386c-4e79-91fe-2cdaf82ab912-kube-api-access-lpbxc\") pod 
\"migrator-74bb7799d9-59slv\" (UID: \"b323d79b-386c-4e79-91fe-2cdaf82ab912\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" Apr 24 23:54:37.203686 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.203580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbxc\" (UniqueName: \"kubernetes.io/projected/b323d79b-386c-4e79-91fe-2cdaf82ab912-kube-api-access-lpbxc\") pod \"migrator-74bb7799d9-59slv\" (UID: \"b323d79b-386c-4e79-91fe-2cdaf82ab912\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" Apr 24 23:54:37.289103 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.289070 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" Apr 24 23:54:37.423887 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.423850 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-87zhf"] Apr 24 23:54:37.434999 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:37.434910 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb323d79b_386c_4e79_91fe_2cdaf82ab912.slice/crio-d022d250d3ace1a4744c31af7ab544b66c1a506d2ad8a243adf1e5d6204aa221 WatchSource:0}: Error finding container d022d250d3ace1a4744c31af7ab544b66c1a506d2ad8a243adf1e5d6204aa221: Status 404 returned error can't find the container with id d022d250d3ace1a4744c31af7ab544b66c1a506d2ad8a243adf1e5d6204aa221 Apr 24 23:54:37.437634 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.437428 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv"] Apr 24 23:54:37.437634 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.437479 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-87zhf"] Apr 24 23:54:37.437634 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:54:37.437514 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:37.441274 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.441252 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 23:54:37.441428 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.441276 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 23:54:37.441727 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.441707 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-74765\"" Apr 24 23:54:37.575477 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.575395 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:37.575477 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.575431 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:37.577934 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.577909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ghb4v\"" Apr 24 23:54:37.578072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.577946 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:37.578072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.577998 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h8trl\"" Apr 24 23:54:37.578072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.577909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:37.587490 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.587286 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:37.597785 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.597719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmw6\" (UniqueName: \"kubernetes.io/projected/ce25c16f-218e-4c97-a5d9-fc6cb1293ba6-kube-api-access-vpmw6\") pod \"downloads-6bcc868b7-87zhf\" (UID: \"ce25c16f-218e-4c97-a5d9-fc6cb1293ba6\") " pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:37.698443 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.698410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmw6\" (UniqueName: \"kubernetes.io/projected/ce25c16f-218e-4c97-a5d9-fc6cb1293ba6-kube-api-access-vpmw6\") pod \"downloads-6bcc868b7-87zhf\" (UID: \"ce25c16f-218e-4c97-a5d9-fc6cb1293ba6\") " 
pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:37.710384 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.710344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmw6\" (UniqueName: \"kubernetes.io/projected/ce25c16f-218e-4c97-a5d9-fc6cb1293ba6-kube-api-access-vpmw6\") pod \"downloads-6bcc868b7-87zhf\" (UID: \"ce25c16f-218e-4c97-a5d9-fc6cb1293ba6\") " pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:37.749925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.749887 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:37.868344 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.868293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" event={"ID":"b323d79b-386c-4e79-91fe-2cdaf82ab912","Type":"ContainerStarted","Data":"d022d250d3ace1a4744c31af7ab544b66c1a506d2ad8a243adf1e5d6204aa221"} Apr 24 23:54:37.869601 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:37.869561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" event={"ID":"4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f","Type":"ContainerStarted","Data":"28a482fe329eafc699d771f36a3b393dbbc787172afd02d8117b52a0490a49ca"} Apr 24 23:54:38.722498 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:38.722467 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f9b5d_d49aa07c-0862-4d3d-85c1-e60a04019252/dns-node-resolver/0.log" Apr 24 23:54:39.167113 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.166870 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-87zhf"] Apr 24 23:54:39.172871 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:39.172816 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce25c16f_218e_4c97_a5d9_fc6cb1293ba6.slice/crio-f1361c59828e3da46d2eefd4dd3c6d25c141e5f3a82b0d25e40b7ab51eaf0ab2 WatchSource:0}: Error finding container f1361c59828e3da46d2eefd4dd3c6d25c141e5f3a82b0d25e40b7ab51eaf0ab2: Status 404 returned error can't find the container with id f1361c59828e3da46d2eefd4dd3c6d25c141e5f3a82b0d25e40b7ab51eaf0ab2 Apr 24 23:54:39.522758 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.522686 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2skl8_2f91e8cd-3f4c-4228-8c96-94fcc37e0124/node-ca/0.log" Apr 24 23:54:39.877353 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.877311 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" event={"ID":"b323d79b-386c-4e79-91fe-2cdaf82ab912","Type":"ContainerStarted","Data":"5ec19f7d5a9fbdc648a92e9eda4015f2b553c5cdbc51cf142f1cc671992d74d8"} Apr 24 23:54:39.877903 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.877357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" event={"ID":"b323d79b-386c-4e79-91fe-2cdaf82ab912","Type":"ContainerStarted","Data":"ab44571738d279008b0e9653c1b918a432803931ee59626fef337c80739b5756"} Apr 24 23:54:39.878855 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.878820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" event={"ID":"4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f","Type":"ContainerStarted","Data":"520db876409270e5e5bcadb40918b43bce25b6c192ffcb7a9c55b0fbd229d06d"} Apr 24 23:54:39.879056 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.879036 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 
23:54:39.880076 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.880042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-87zhf" event={"ID":"ce25c16f-218e-4c97-a5d9-fc6cb1293ba6","Type":"ContainerStarted","Data":"f1361c59828e3da46d2eefd4dd3c6d25c141e5f3a82b0d25e40b7ab51eaf0ab2"} Apr 24 23:54:39.881754 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.881731 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qm5m4" event={"ID":"5dbdb447-15bf-4e90-abe6-da3abc588e4a","Type":"ContainerStarted","Data":"de9f4a6bcbdb3349dd4f9fdc0062ab02dfe27e2d0ca4186f3623a62966ce821b"} Apr 24 23:54:39.883645 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.883621 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fcfzm" event={"ID":"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f","Type":"ContainerStarted","Data":"7e9741fb7728633db1cb991958e32a161b931e8dda96a4c06a23af9fda067bec"} Apr 24 23:54:39.883743 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.883653 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fcfzm" event={"ID":"e80ea08a-a6a9-4898-a4aa-e2a17c1e990f","Type":"ContainerStarted","Data":"908710950ac0fb7b2f685e2b26b8d651129069dae59605191dde7f917c33bd29"} Apr 24 23:54:39.883813 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.883765 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:39.884719 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.884686 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" Apr 24 23:54:39.894484 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.894444 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-59slv" 
podStartSLOduration=2.31034808 podStartE2EDuration="3.894432292s" podCreationTimestamp="2026-04-24 23:54:36 +0000 UTC" firstStartedPulling="2026-04-24 23:54:37.43785328 +0000 UTC m=+50.428257286" lastFinishedPulling="2026-04-24 23:54:39.021937501 +0000 UTC m=+52.012341498" observedRunningTime="2026-04-24 23:54:39.893531783 +0000 UTC m=+52.883935797" watchObservedRunningTime="2026-04-24 23:54:39.894432292 +0000 UTC m=+52.884836305" Apr 24 23:54:39.910488 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.910441 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qm5m4" podStartSLOduration=2.289503382 podStartE2EDuration="4.910424581s" podCreationTimestamp="2026-04-24 23:54:35 +0000 UTC" firstStartedPulling="2026-04-24 23:54:36.397056456 +0000 UTC m=+49.387460449" lastFinishedPulling="2026-04-24 23:54:39.017977645 +0000 UTC m=+52.008381648" observedRunningTime="2026-04-24 23:54:39.909638968 +0000 UTC m=+52.900042982" watchObservedRunningTime="2026-04-24 23:54:39.910424581 +0000 UTC m=+52.900828590" Apr 24 23:54:39.927541 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.926583 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fcfzm" podStartSLOduration=2.30413033 podStartE2EDuration="4.926566149s" podCreationTimestamp="2026-04-24 23:54:35 +0000 UTC" firstStartedPulling="2026-04-24 23:54:36.395540618 +0000 UTC m=+49.385944609" lastFinishedPulling="2026-04-24 23:54:39.017976422 +0000 UTC m=+52.008380428" observedRunningTime="2026-04-24 23:54:39.926393266 +0000 UTC m=+52.916797312" watchObservedRunningTime="2026-04-24 23:54:39.926566149 +0000 UTC m=+52.916970143" Apr 24 23:54:39.944654 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:39.944598 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5sb5t" podStartSLOduration=1.85019347 
podStartE2EDuration="3.944581799s" podCreationTimestamp="2026-04-24 23:54:36 +0000 UTC" firstStartedPulling="2026-04-24 23:54:36.924997294 +0000 UTC m=+49.915401287" lastFinishedPulling="2026-04-24 23:54:39.019385626 +0000 UTC m=+52.009789616" observedRunningTime="2026-04-24 23:54:39.943731068 +0000 UTC m=+52.934135117" watchObservedRunningTime="2026-04-24 23:54:39.944581799 +0000 UTC m=+52.934985812" Apr 24 23:54:40.511449 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.511409 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9z2ng"] Apr 24 23:54:40.514532 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.514509 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.517832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.517585 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 23:54:40.517832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.517642 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 23:54:40.517832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.517677 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 23:54:40.517832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.517803 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 23:54:40.517832 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.517830 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 23:54:40.518144 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.517590 2569 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-k5gb5\"" Apr 24 23:54:40.525233 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.525213 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9z2ng"] Apr 24 23:54:40.526101 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.525756 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qm5m4_5dbdb447-15bf-4e90-abe6-da3abc588e4a/serve-healthcheck-canary/0.log" Apr 24 23:54:40.619417 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.619349 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frbl6\" (UniqueName: \"kubernetes.io/projected/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-kube-api-access-frbl6\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.619593 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.619434 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.619593 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.619478 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.619593 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.619571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.720124 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.720081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.720299 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.720143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.720299 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.720185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.720299 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.720237 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frbl6\" (UniqueName: \"kubernetes.io/projected/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-kube-api-access-frbl6\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.721122 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.721073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.724758 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.724733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.724758 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.724750 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.727719 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.727697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frbl6\" (UniqueName: \"kubernetes.io/projected/ac737ca1-d4f9-4109-9912-5ed5c7876fb4-kube-api-access-frbl6\") pod 
\"prometheus-operator-5676c8c784-9z2ng\" (UID: \"ac737ca1-d4f9-4109-9912-5ed5c7876fb4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.825792 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.825712 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" Apr 24 23:54:40.960213 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:40.960182 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9z2ng"] Apr 24 23:54:40.964301 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:40.964273 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac737ca1_d4f9_4109_9912_5ed5c7876fb4.slice/crio-7ae6daa924738812ea711a606b4e8b4131fbc188198b5aaf54ce7158e6573b91 WatchSource:0}: Error finding container 7ae6daa924738812ea711a606b4e8b4131fbc188198b5aaf54ce7158e6573b91: Status 404 returned error can't find the container with id 7ae6daa924738812ea711a606b4e8b4131fbc188198b5aaf54ce7158e6573b91 Apr 24 23:54:41.893209 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:41.893156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" event={"ID":"ac737ca1-d4f9-4109-9912-5ed5c7876fb4","Type":"ContainerStarted","Data":"7ae6daa924738812ea711a606b4e8b4131fbc188198b5aaf54ce7158e6573b91"} Apr 24 23:54:42.898453 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:42.898413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" event={"ID":"ac737ca1-d4f9-4109-9912-5ed5c7876fb4","Type":"ContainerStarted","Data":"23651fd7e4bba23cb6c6dde4a992773042101912d24e2ef352c92f6ebe032b51"} Apr 24 23:54:42.898453 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:42.898461 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" event={"ID":"ac737ca1-d4f9-4109-9912-5ed5c7876fb4","Type":"ContainerStarted","Data":"988194a3b6ad38d12cb9855629e5aa6cc818295a30cd3b698dc096a13ee3538a"} Apr 24 23:54:42.916094 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:42.916031 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9z2ng" podStartSLOduration=1.5460218019999998 podStartE2EDuration="2.916017731s" podCreationTimestamp="2026-04-24 23:54:40 +0000 UTC" firstStartedPulling="2026-04-24 23:54:40.966780808 +0000 UTC m=+53.957184799" lastFinishedPulling="2026-04-24 23:54:42.336776733 +0000 UTC m=+55.327180728" observedRunningTime="2026-04-24 23:54:42.914906325 +0000 UTC m=+55.905310339" watchObservedRunningTime="2026-04-24 23:54:42.916017731 +0000 UTC m=+55.906421743" Apr 24 23:54:44.873756 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.873722 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9"] Apr 24 23:54:44.912508 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.912398 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xb82g"] Apr 24 23:54:44.912923 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.912691 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:44.915053 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.915024 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qs65v\"" Apr 24 23:54:44.915557 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.915528 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 23:54:44.915672 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.915562 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 23:54:44.927264 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.927243 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9"] Apr 24 23:54:44.927404 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.927287 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xb82g"] Apr 24 23:54:44.927404 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.927295 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:44.927404 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.927303 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ddw4g"] Apr 24 23:54:44.929790 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.929706 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 23:54:44.929889 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.929811 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 23:54:44.929991 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.929975 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-n4zmm\"" Apr 24 23:54:44.930161 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.930137 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 23:54:44.933151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.933129 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.935302 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.935141 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 23:54:44.935497 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.935301 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 23:54:44.935676 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.935521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 23:54:44.935802 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.935781 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m6xfv\"" Apr 24 23:54:44.945547 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945521 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlp8n\" (UniqueName: \"kubernetes.io/projected/ea7d009e-6127-4f50-8b69-48fd33a90311-kube-api-access-mlp8n\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:44.945660 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945563 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:44.945660 ip-10-0-142-30 kubenswrapper[2569]: I0424 
23:54:44.945585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-wtmp\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.945660 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945631 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-sys\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.945822 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea7d009e-6127-4f50-8b69-48fd33a90311-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:44.945822 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8e0c2d-19dc-4d83-a697-7eca0069d620-metrics-client-ca\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.945822 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/94089fb1-ab0c-4039-8198-ad6fb5562f6a-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:44.945822 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:44.946022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945825 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x455k\" (UniqueName: \"kubernetes.io/projected/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-api-access-x455k\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:44.946022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.946022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945931 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmg5\" (UniqueName: \"kubernetes.io/projected/dd8e0c2d-19dc-4d83-a697-7eca0069d620-kube-api-access-smmg5\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " 
pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.946022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea7d009e-6127-4f50-8b69-48fd33a90311-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:44.946022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.945987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-textfile\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.946444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.946063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-root\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.946444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.946104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-accelerators-collector-config\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.946444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.946153 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea7d009e-6127-4f50-8b69-48fd33a90311-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:44.946444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.946173 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-tls\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:44.946444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.946190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94089fb1-ab0c-4039-8198-ad6fb5562f6a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:44.946444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:44.946228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.046688 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea7d009e-6127-4f50-8b69-48fd33a90311-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.046863 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8e0c2d-19dc-4d83-a697-7eca0069d620-metrics-client-ca\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.046863 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/94089fb1-ab0c-4039-8198-ad6fb5562f6a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.046863 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.046863 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046820 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x455k\" (UniqueName: \"kubernetes.io/projected/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-api-access-x455k\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.046863 
ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046880 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smmg5\" (UniqueName: \"kubernetes.io/projected/dd8e0c2d-19dc-4d83-a697-7eca0069d620-kube-api-access-smmg5\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea7d009e-6127-4f50-8b69-48fd33a90311-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-textfile\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.046985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-root\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-accelerators-collector-config\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047047 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea7d009e-6127-4f50-8b69-48fd33a90311-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047071 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-tls\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047114 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94089fb1-ab0c-4039-8198-ad6fb5562f6a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: 
I0424 23:54:45.047120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlp8n\" (UniqueName: \"kubernetes.io/projected/ea7d009e-6127-4f50-8b69-48fd33a90311-kube-api-access-mlp8n\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/94089fb1-ab0c-4039-8198-ad6fb5562f6a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:45.047334 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 23:54:45.047549 
ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8e0c2d-19dc-4d83-a697-7eca0069d620-metrics-client-ca\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:45.047414 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-tls podName:dd8e0c2d-19dc-4d83-a697-7eca0069d620 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:45.547393682 +0000 UTC m=+58.537797689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-tls") pod "node-exporter-ddw4g" (UID: "dd8e0c2d-19dc-4d83-a697-7eca0069d620") : secret "node-exporter-tls" not found Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-textfile\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047549 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047163 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-root\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.047896 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.047835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-accelerators-collector-config\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.048788 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.048406 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea7d009e-6127-4f50-8b69-48fd33a90311-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.048788 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.048483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-wtmp\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.048788 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.048618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-wtmp\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.048788 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.048668 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-sys\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.048788 ip-10-0-142-30 kubenswrapper[2569]: I0424 
23:54:45.048735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd8e0c2d-19dc-4d83-a697-7eca0069d620-sys\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.049224 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.048821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94089fb1-ab0c-4039-8198-ad6fb5562f6a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.049224 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.048877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.049928 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.049900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.050059 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.050036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.050151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.050122 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea7d009e-6127-4f50-8b69-48fd33a90311-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.050261 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.050189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea7d009e-6127-4f50-8b69-48fd33a90311-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.050915 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.050890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.060385 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.060343 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmg5\" (UniqueName: \"kubernetes.io/projected/dd8e0c2d-19dc-4d83-a697-7eca0069d620-kube-api-access-smmg5\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.061994 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:54:45.061972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlp8n\" (UniqueName: \"kubernetes.io/projected/ea7d009e-6127-4f50-8b69-48fd33a90311-kube-api-access-mlp8n\") pod \"openshift-state-metrics-9d44df66c-hdcq9\" (UID: \"ea7d009e-6127-4f50-8b69-48fd33a90311\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.062076 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.061972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x455k\" (UniqueName: \"kubernetes.io/projected/94089fb1-ab0c-4039-8198-ad6fb5562f6a-kube-api-access-x455k\") pod \"kube-state-metrics-69db897b98-xb82g\" (UID: \"94089fb1-ab0c-4039-8198-ad6fb5562f6a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.224677 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.224570 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" Apr 24 23:54:45.239428 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.239388 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" Apr 24 23:54:45.375976 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.375869 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9"] Apr 24 23:54:45.379334 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:45.379301 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea7d009e_6127_4f50_8b69_48fd33a90311.slice/crio-b42c8ee3aa92a36513fd23f403aa4e2e38f9491b504da45a99945fabaad861a2 WatchSource:0}: Error finding container b42c8ee3aa92a36513fd23f403aa4e2e38f9491b504da45a99945fabaad861a2: Status 404 returned error can't find the container with id b42c8ee3aa92a36513fd23f403aa4e2e38f9491b504da45a99945fabaad861a2 Apr 24 23:54:45.551348 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.551311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-tls\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.554225 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.554182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8e0c2d-19dc-4d83-a697-7eca0069d620-node-exporter-tls\") pod \"node-exporter-ddw4g\" (UID: \"dd8e0c2d-19dc-4d83-a697-7eca0069d620\") " pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.599636 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.599602 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xb82g"] Apr 24 23:54:45.603792 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:45.603764 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94089fb1_ab0c_4039_8198_ad6fb5562f6a.slice/crio-0341c86e1c02716460b08fd42f00668687d69c63f466a84be6a21e3a603ef79d WatchSource:0}: Error finding container 0341c86e1c02716460b08fd42f00668687d69c63f466a84be6a21e3a603ef79d: Status 404 returned error can't find the container with id 0341c86e1c02716460b08fd42f00668687d69c63f466a84be6a21e3a603ef79d Apr 24 23:54:45.827014 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.826929 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v9qcn" Apr 24 23:54:45.845592 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.845563 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ddw4g" Apr 24 23:54:45.855573 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:45.855549 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8e0c2d_19dc_4d83_a697_7eca0069d620.slice/crio-08dd718ca184971c1d9c6c2e8843dc84b39c853a1f510d350d4e0c50ee9481ad WatchSource:0}: Error finding container 08dd718ca184971c1d9c6c2e8843dc84b39c853a1f510d350d4e0c50ee9481ad: Status 404 returned error can't find the container with id 08dd718ca184971c1d9c6c2e8843dc84b39c853a1f510d350d4e0c50ee9481ad Apr 24 23:54:45.909464 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.909421 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" event={"ID":"94089fb1-ab0c-4039-8198-ad6fb5562f6a","Type":"ContainerStarted","Data":"0341c86e1c02716460b08fd42f00668687d69c63f466a84be6a21e3a603ef79d"} Apr 24 23:54:45.910717 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.910682 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ddw4g" 
event={"ID":"dd8e0c2d-19dc-4d83-a697-7eca0069d620","Type":"ContainerStarted","Data":"08dd718ca184971c1d9c6c2e8843dc84b39c853a1f510d350d4e0c50ee9481ad"} Apr 24 23:54:45.912448 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.912424 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" event={"ID":"ea7d009e-6127-4f50-8b69-48fd33a90311","Type":"ContainerStarted","Data":"7ec3901612593c9068ca1d73f9cd9681c8d45f4e6f61123312487f1f6371d7d7"} Apr 24 23:54:45.912541 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.912457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" event={"ID":"ea7d009e-6127-4f50-8b69-48fd33a90311","Type":"ContainerStarted","Data":"423cd126e4bfff340cd98f885ccedde43f0da36e21a74e5b36afd02f39135948"} Apr 24 23:54:45.912541 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:45.912470 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" event={"ID":"ea7d009e-6127-4f50-8b69-48fd33a90311","Type":"ContainerStarted","Data":"b42c8ee3aa92a36513fd23f403aa4e2e38f9491b504da45a99945fabaad861a2"} Apr 24 23:54:47.202482 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.202345 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jcrnq"] Apr 24 23:54:47.206705 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.206668 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.212151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.210868 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:54:47.212151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.210980 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:54:47.212151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.210868 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:54:47.212151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.211329 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:54:47.212151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.211691 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r9wdl\"" Apr 24 23:54:47.225804 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.222631 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jcrnq"] Apr 24 23:54:47.266442 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.266314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg87l\" (UniqueName: \"kubernetes.io/projected/7a1d9e52-82af-467c-b492-7a7aa2f20acb-kube-api-access-zg87l\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.266442 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.266394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a1d9e52-82af-467c-b492-7a7aa2f20acb-crio-socket\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.266442 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.266424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a1d9e52-82af-467c-b492-7a7aa2f20acb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.266707 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.266487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a1d9e52-82af-467c-b492-7a7aa2f20acb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.266707 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.266624 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a1d9e52-82af-467c-b492-7a7aa2f20acb-data-volume\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a1d9e52-82af-467c-b492-7a7aa2f20acb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jcrnq\" (UID: 
\"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367217 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a1d9e52-82af-467c-b492-7a7aa2f20acb-data-volume\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367256 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zg87l\" (UniqueName: \"kubernetes.io/projected/7a1d9e52-82af-467c-b492-7a7aa2f20acb-kube-api-access-zg87l\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a1d9e52-82af-467c-b492-7a7aa2f20acb-crio-socket\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a1d9e52-82af-467c-b492-7a7aa2f20acb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/7a1d9e52-82af-467c-b492-7a7aa2f20acb-data-volume\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368049 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.367950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a1d9e52-82af-467c-b492-7a7aa2f20acb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.368481 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.368084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a1d9e52-82af-467c-b492-7a7aa2f20acb-crio-socket\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.370495 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.370477 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a1d9e52-82af-467c-b492-7a7aa2f20acb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.378965 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.378945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg87l\" (UniqueName: \"kubernetes.io/projected/7a1d9e52-82af-467c-b492-7a7aa2f20acb-kube-api-access-zg87l\") pod \"insights-runtime-extractor-jcrnq\" (UID: \"7a1d9e52-82af-467c-b492-7a7aa2f20acb\") " pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.535502 ip-10-0-142-30 kubenswrapper[2569]: 
I0424 23:54:47.535473 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r9wdl\"" Apr 24 23:54:47.542530 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.542392 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jcrnq" Apr 24 23:54:47.622525 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:47.621957 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8e0c2d_19dc_4d83_a697_7eca0069d620.slice/crio-726e444e69645903890a4c07a6af69d9ad8bc69c8c3b3050327d52cbbd92e013.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8e0c2d_19dc_4d83_a697_7eca0069d620.slice/crio-conmon-726e444e69645903890a4c07a6af69d9ad8bc69c8c3b3050327d52cbbd92e013.scope\": RecentStats: unable to find data in memory cache]" Apr 24 23:54:47.710069 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.710034 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jcrnq"] Apr 24 23:54:47.713604 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:47.713569 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1d9e52_82af_467c_b492_7a7aa2f20acb.slice/crio-2f524c3a54b56e944ff1028ac8db01545b20116e378f19fa6a1317f35d9b309e WatchSource:0}: Error finding container 2f524c3a54b56e944ff1028ac8db01545b20116e378f19fa6a1317f35d9b309e: Status 404 returned error can't find the container with id 2f524c3a54b56e944ff1028ac8db01545b20116e378f19fa6a1317f35d9b309e Apr 24 23:54:47.920779 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.920740 2569 generic.go:358] "Generic (PLEG): container finished" 
podID="dd8e0c2d-19dc-4d83-a697-7eca0069d620" containerID="726e444e69645903890a4c07a6af69d9ad8bc69c8c3b3050327d52cbbd92e013" exitCode=0 Apr 24 23:54:47.920960 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.920785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ddw4g" event={"ID":"dd8e0c2d-19dc-4d83-a697-7eca0069d620","Type":"ContainerDied","Data":"726e444e69645903890a4c07a6af69d9ad8bc69c8c3b3050327d52cbbd92e013"} Apr 24 23:54:47.923468 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.923437 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" event={"ID":"ea7d009e-6127-4f50-8b69-48fd33a90311","Type":"ContainerStarted","Data":"5a2e0eebaeed05ef0f1768cc727b983881e522d8c54df14d1cc52f17e04068e4"} Apr 24 23:54:47.925141 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.925117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jcrnq" event={"ID":"7a1d9e52-82af-467c-b492-7a7aa2f20acb","Type":"ContainerStarted","Data":"3eb3e1033b2d00e25f22ebdb0159a8dad3272d368e85a1c89d782a750e00f80d"} Apr 24 23:54:47.925242 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.925146 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jcrnq" event={"ID":"7a1d9e52-82af-467c-b492-7a7aa2f20acb","Type":"ContainerStarted","Data":"2f524c3a54b56e944ff1028ac8db01545b20116e378f19fa6a1317f35d9b309e"} Apr 24 23:54:47.927585 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.927562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" event={"ID":"94089fb1-ab0c-4039-8198-ad6fb5562f6a","Type":"ContainerStarted","Data":"4acb347b38c8b097e7898115f79a10d969cef4b729f2d6400fed8ae40c869775"} Apr 24 23:54:47.927672 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.927613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" event={"ID":"94089fb1-ab0c-4039-8198-ad6fb5562f6a","Type":"ContainerStarted","Data":"782bded3da38f8320e943051518552442d13e56c6d540cacfa82116b82902ed0"} Apr 24 23:54:47.927672 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.927630 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" event={"ID":"94089fb1-ab0c-4039-8198-ad6fb5562f6a","Type":"ContainerStarted","Data":"5b595a48eabda8f4c079b704bcca507c690cd43961f6ad9b2c20c0823b3d77c8"} Apr 24 23:54:47.977798 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.977119 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-58c9db9969-hg2cg"] Apr 24 23:54:47.977798 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.977606 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hdcq9" podStartSLOduration=2.201840984 podStartE2EDuration="3.977591129s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.526070878 +0000 UTC m=+58.516474871" lastFinishedPulling="2026-04-24 23:54:47.301821015 +0000 UTC m=+60.292225016" observedRunningTime="2026-04-24 23:54:47.974954659 +0000 UTC m=+60.965358696" watchObservedRunningTime="2026-04-24 23:54:47.977591129 +0000 UTC m=+60.967995143" Apr 24 23:54:47.985519 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.985493 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.987544 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.987798 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-b2b0kquo6bj8k\"" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.987847 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.987997 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-82chl\"" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.988056 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.988092 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 23:54:47.988267 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.988112 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 23:54:47.993735 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:47.993716 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-58c9db9969-hg2cg"] Apr 24 23:54:48.001462 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.001421 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-xb82g" podStartSLOduration=2.30297378 podStartE2EDuration="4.001407302s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.606126647 +0000 UTC m=+58.596530667" lastFinishedPulling="2026-04-24 23:54:47.304560194 +0000 UTC m=+60.294964189" observedRunningTime="2026-04-24 23:54:48.00030648 +0000 UTC m=+60.990710537" watchObservedRunningTime="2026-04-24 23:54:48.001407302 +0000 UTC m=+60.991811319" Apr 24 23:54:48.073703 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.073672 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmhm\" (UniqueName: \"kubernetes.io/projected/43258f38-cf4f-4e21-b609-e209e07de132-kube-api-access-cwmhm\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.073816 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.073745 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.073816 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.073777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43258f38-cf4f-4e21-b609-e209e07de132-metrics-client-ca\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.073816 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.073806 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-tls\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.073985 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.073867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.073985 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.073907 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-grpc-tls\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.074087 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.074019 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.074137 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.074111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175500 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175461 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmhm\" (UniqueName: \"kubernetes.io/projected/43258f38-cf4f-4e21-b609-e209e07de132-kube-api-access-cwmhm\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175669 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175669 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43258f38-cf4f-4e21-b609-e209e07de132-metrics-client-ca\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175669 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-tls\") pod 
\"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175802 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175802 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-grpc-tls\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175873 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.175873 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.175865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 
23:54:48.176506 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.176423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43258f38-cf4f-4e21-b609-e209e07de132-metrics-client-ca\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.179604 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.179573 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.179725 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.179661 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.180248 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.180206 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-tls\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.180477 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.180458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.180912 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.180881 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-grpc-tls\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.181624 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.181584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43258f38-cf4f-4e21-b609-e209e07de132-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.183786 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.183749 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmhm\" (UniqueName: \"kubernetes.io/projected/43258f38-cf4f-4e21-b609-e209e07de132-kube-api-access-cwmhm\") pod \"thanos-querier-58c9db9969-hg2cg\" (UID: \"43258f38-cf4f-4e21-b609-e209e07de132\") " pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.298777 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.298692 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:54:48.450805 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.450551 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-58c9db9969-hg2cg"] Apr 24 23:54:48.455240 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:48.455209 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43258f38_cf4f_4e21_b609_e209e07de132.slice/crio-b850694288c0f3f4a4206bc4c20fbe977bda682e984717b374d2b720bc1e94d4 WatchSource:0}: Error finding container b850694288c0f3f4a4206bc4c20fbe977bda682e984717b374d2b720bc1e94d4: Status 404 returned error can't find the container with id b850694288c0f3f4a4206bc4c20fbe977bda682e984717b374d2b720bc1e94d4 Apr 24 23:54:48.933126 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.933044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ddw4g" event={"ID":"dd8e0c2d-19dc-4d83-a697-7eca0069d620","Type":"ContainerStarted","Data":"364dd9c68324c10dd20c9b12c86cb7f16339034b68ba2fdff03bd8de6195ec80"} Apr 24 23:54:48.933126 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.933100 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ddw4g" event={"ID":"dd8e0c2d-19dc-4d83-a697-7eca0069d620","Type":"ContainerStarted","Data":"9783b22fd38ad011f327d8364c2e12663d7990ec3dfa90805f9504c0874180de"} Apr 24 23:54:48.934962 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.934922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jcrnq" event={"ID":"7a1d9e52-82af-467c-b492-7a7aa2f20acb","Type":"ContainerStarted","Data":"19206e081c376267db57d7e09d5dfaa8d681ea273524af310d2ee19ca8ce10da"} Apr 24 23:54:48.936334 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.936206 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"b850694288c0f3f4a4206bc4c20fbe977bda682e984717b374d2b720bc1e94d4"} Apr 24 23:54:48.955810 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:48.955752 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ddw4g" podStartSLOduration=3.5134014220000003 podStartE2EDuration="4.955733042s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:45.859689528 +0000 UTC m=+58.850093525" lastFinishedPulling="2026-04-24 23:54:47.302021139 +0000 UTC m=+60.292425145" observedRunningTime="2026-04-24 23:54:48.953863492 +0000 UTC m=+61.944267506" watchObservedRunningTime="2026-04-24 23:54:48.955733042 +0000 UTC m=+61.946137056" Apr 24 23:54:49.649008 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.648972 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl"] Apr 24 23:54:49.652439 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.652419 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:54:49.655268 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.655247 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 23:54:49.655446 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.655348 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-nh9cx\"" Apr 24 23:54:49.660481 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.660453 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl"] Apr 24 23:54:49.688034 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.688005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a818f70c-58fa-40be-b78e-11838a11f8b9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fxrzl\" (UID: \"a818f70c-58fa-40be-b78e-11838a11f8b9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:54:49.788444 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.788411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a818f70c-58fa-40be-b78e-11838a11f8b9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fxrzl\" (UID: \"a818f70c-58fa-40be-b78e-11838a11f8b9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:54:49.788590 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:49.788564 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 23:54:49.788628 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:54:49.788624 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a818f70c-58fa-40be-b78e-11838a11f8b9-monitoring-plugin-cert podName:a818f70c-58fa-40be-b78e-11838a11f8b9 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:50.288609823 +0000 UTC m=+63.279013814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a818f70c-58fa-40be-b78e-11838a11f8b9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-fxrzl" (UID: "a818f70c-58fa-40be-b78e-11838a11f8b9") : secret "monitoring-plugin-cert" not found Apr 24 23:54:49.890990 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:49.890957 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fcfzm" Apr 24 23:54:50.293011 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:50.292975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a818f70c-58fa-40be-b78e-11838a11f8b9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fxrzl\" (UID: \"a818f70c-58fa-40be-b78e-11838a11f8b9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:54:50.295810 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:50.295781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a818f70c-58fa-40be-b78e-11838a11f8b9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fxrzl\" (UID: \"a818f70c-58fa-40be-b78e-11838a11f8b9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:54:50.564566 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:50.564485 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:54:52.310801 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.310765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:52.313145 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.313110 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:52.323693 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.323667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/865ada3d-5576-4faa-98c1-2b867558ffc0-metrics-certs\") pod \"network-metrics-daemon-z72wj\" (UID: \"865ada3d-5576-4faa-98c1-2b867558ffc0\") " pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:52.412462 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.412425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:52.415709 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.415676 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:52.424800 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.424771 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 
23:54:52.445994 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.445961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ktzp\" (UniqueName: \"kubernetes.io/projected/2bca4834-ef71-4be6-ac75-bf2bc736877d-kube-api-access-6ktzp\") pod \"network-check-target-zsqnv\" (UID: \"2bca4834-ef71-4be6-ac75-bf2bc736877d\") " pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:52.608629 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.608593 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ghb4v\"" Apr 24 23:54:52.616256 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.616229 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h8trl\"" Apr 24 23:54:52.617260 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.617243 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:54:52.624957 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:52.624928 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z72wj" Apr 24 23:54:56.493230 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.484416 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-865cb5cb95-mp69m"] Apr 24 23:54:56.495334 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.495306 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865cb5cb95-mp69m"] Apr 24 23:54:56.495488 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.495472 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.525799 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.511808 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 23:54:56.525799 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.512786 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 23:54:56.525799 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.513214 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 23:54:56.534736 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.527569 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 23:54:56.534736 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.527835 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qmbm4\"" Apr 24 23:54:56.534736 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.528073 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 23:54:56.543423 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.542560 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 23:54:56.549959 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.549714 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-oauth-config\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.549959 ip-10-0-142-30 
kubenswrapper[2569]: I0424 23:54:56.549764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbt25\" (UniqueName: \"kubernetes.io/projected/0f2787d4-5657-4c05-92fa-078f95fe836c-kube-api-access-zbt25\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.549959 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.549816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-trusted-ca-bundle\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.549959 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.549852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-oauth-serving-cert\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.549959 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.549896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-console-config\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.550344 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.550004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-serving-cert\") 
pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.550344 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.550049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-service-ca\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.602666 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.602637 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl"] Apr 24 23:54:56.609580 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:56.609542 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda818f70c_58fa_40be_b78e_11838a11f8b9.slice/crio-88d0ffb3f0b0e1035e2f9f6a80614f03f371f0f6654e9a1e217672fef73f5b3b WatchSource:0}: Error finding container 88d0ffb3f0b0e1035e2f9f6a80614f03f371f0f6654e9a1e217672fef73f5b3b: Status 404 returned error can't find the container with id 88d0ffb3f0b0e1035e2f9f6a80614f03f371f0f6654e9a1e217672fef73f5b3b Apr 24 23:54:56.650644 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-trusted-ca-bundle\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.650820 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-oauth-serving-cert\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.650820 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650705 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-console-config\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.650820 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-serving-cert\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.650820 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-service-ca\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.650820 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650808 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-oauth-config\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.651061 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.650838 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zbt25\" (UniqueName: \"kubernetes.io/projected/0f2787d4-5657-4c05-92fa-078f95fe836c-kube-api-access-zbt25\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.651560 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.651459 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-oauth-serving-cert\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.651704 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.651583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-trusted-ca-bundle\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.651919 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.651895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-service-ca\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.652007 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.651959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-console-config\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.653874 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.653844 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-oauth-config\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.654674 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.654653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-serving-cert\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.659276 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.659218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbt25\" (UniqueName: \"kubernetes.io/projected/0f2787d4-5657-4c05-92fa-078f95fe836c-kube-api-access-zbt25\") pod \"console-865cb5cb95-mp69m\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") " pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.829764 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.829628 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zsqnv"] Apr 24 23:54:56.833875 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:56.833809 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bca4834_ef71_4be6_ac75_bf2bc736877d.slice/crio-639a5fe10cb6ad99ca3a999c43d1372197303d5438c2301459ccea2f7f99c9cf WatchSource:0}: Error finding container 639a5fe10cb6ad99ca3a999c43d1372197303d5438c2301459ccea2f7f99c9cf: Status 404 returned error can't find the container with id 639a5fe10cb6ad99ca3a999c43d1372197303d5438c2301459ccea2f7f99c9cf Apr 24 23:54:56.833999 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.833875 2569 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-multus/network-metrics-daemon-z72wj"] Apr 24 23:54:56.838421 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:56.838389 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865ada3d_5576_4faa_98c1_2b867558ffc0.slice/crio-dda1c4315b81e1364714df896a86acef9c4f28a3e0a0c29ef85cdd0aa133053a WatchSource:0}: Error finding container dda1c4315b81e1364714df896a86acef9c4f28a3e0a0c29ef85cdd0aa133053a: Status 404 returned error can't find the container with id dda1c4315b81e1364714df896a86acef9c4f28a3e0a0c29ef85cdd0aa133053a Apr 24 23:54:56.843313 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.843062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:54:56.964096 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.964036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z72wj" event={"ID":"865ada3d-5576-4faa-98c1-2b867558ffc0","Type":"ContainerStarted","Data":"dda1c4315b81e1364714df896a86acef9c4f28a3e0a0c29ef85cdd0aa133053a"} Apr 24 23:54:56.965925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.965890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" event={"ID":"a818f70c-58fa-40be-b78e-11838a11f8b9","Type":"ContainerStarted","Data":"88d0ffb3f0b0e1035e2f9f6a80614f03f371f0f6654e9a1e217672fef73f5b3b"} Apr 24 23:54:56.968588 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.968538 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jcrnq" event={"ID":"7a1d9e52-82af-467c-b492-7a7aa2f20acb","Type":"ContainerStarted","Data":"4afd61c273d4712e31a31f0f26e4fa2757af69920cdfe14136043a0ff8b1e1ea"} Apr 24 23:54:56.970729 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.970678 2569 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-console/downloads-6bcc868b7-87zhf" event={"ID":"ce25c16f-218e-4c97-a5d9-fc6cb1293ba6","Type":"ContainerStarted","Data":"3620ee3dcb5bd3bf77747d32ae3d61f73522424f43c3db6fdd9107bcb648a62d"} Apr 24 23:54:56.970862 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.970833 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:56.972623 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.972590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zsqnv" event={"ID":"2bca4834-ef71-4be6-ac75-bf2bc736877d","Type":"ContainerStarted","Data":"639a5fe10cb6ad99ca3a999c43d1372197303d5438c2301459ccea2f7f99c9cf"} Apr 24 23:54:56.979911 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.979882 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-87zhf" Apr 24 23:54:56.985589 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.985544 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jcrnq" podStartSLOduration=1.3520524250000001 podStartE2EDuration="9.985531115s" podCreationTimestamp="2026-04-24 23:54:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:47.780051159 +0000 UTC m=+60.770455153" lastFinishedPulling="2026-04-24 23:54:56.413529851 +0000 UTC m=+69.403933843" observedRunningTime="2026-04-24 23:54:56.985087324 +0000 UTC m=+69.975491351" watchObservedRunningTime="2026-04-24 23:54:56.985531115 +0000 UTC m=+69.975935181" Apr 24 23:54:56.993038 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.993012 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-865cb5cb95-mp69m"] Apr 24 23:54:56.996908 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:54:56.996876 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2787d4_5657_4c05_92fa_078f95fe836c.slice/crio-934bcf03ce3612290814e2734281c2ce394643e170bd6e123a0995e5f9507d5b WatchSource:0}: Error finding container 934bcf03ce3612290814e2734281c2ce394643e170bd6e123a0995e5f9507d5b: Status 404 returned error can't find the container with id 934bcf03ce3612290814e2734281c2ce394643e170bd6e123a0995e5f9507d5b Apr 24 23:54:57.000008 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:56.999887 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-87zhf" podStartSLOduration=2.704934078 podStartE2EDuration="19.999871269s" podCreationTimestamp="2026-04-24 23:54:37 +0000 UTC" firstStartedPulling="2026-04-24 23:54:39.174455579 +0000 UTC m=+52.164859570" lastFinishedPulling="2026-04-24 23:54:56.46939276 +0000 UTC m=+69.459796761" observedRunningTime="2026-04-24 23:54:56.999195155 +0000 UTC m=+69.989599172" watchObservedRunningTime="2026-04-24 23:54:56.999871269 +0000 UTC m=+69.990275282" Apr 24 23:54:57.993900 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:57.993821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865cb5cb95-mp69m" event={"ID":"0f2787d4-5657-4c05-92fa-078f95fe836c","Type":"ContainerStarted","Data":"934bcf03ce3612290814e2734281c2ce394643e170bd6e123a0995e5f9507d5b"} Apr 24 23:54:57.997435 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:57.997407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"46717f80faecb1b68e17873832103d7b9bafd9a65dcf2507d9f8259967ae01fc"} Apr 24 23:54:57.997575 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:57.997562 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" 
event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"b99fc6992e4241504bfd8ebba3ebdedddecc1015cbfe02fda96bd7c8325f4650"} Apr 24 23:54:59.007234 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:54:59.007178 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"0d576e54ac785772933ba6963b0f065e00f14c30527b26fb9c6ad86189bf2867"} Apr 24 23:55:04.032337 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.032301 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"19e58aa9991f320ffaeb390ee968ef8d8a199167f34d46b27a4f1e2ff8feed52"} Apr 24 23:55:04.032745 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.032349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"def89e3844d5f433175ccbfaa8143565f382bd02b29f74f9478cb452ae12d84e"} Apr 24 23:55:04.034404 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.033846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zsqnv" event={"ID":"2bca4834-ef71-4be6-ac75-bf2bc736877d","Type":"ContainerStarted","Data":"5f22fa1670fcc5a406d7982721c939b8dab21b0c969bcb7149f30da894bf8bf8"} Apr 24 23:55:04.034404 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.034204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:55:04.035830 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.035779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865cb5cb95-mp69m" 
event={"ID":"0f2787d4-5657-4c05-92fa-078f95fe836c","Type":"ContainerStarted","Data":"58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd"} Apr 24 23:55:04.039925 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.039533 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z72wj" event={"ID":"865ada3d-5576-4faa-98c1-2b867558ffc0","Type":"ContainerStarted","Data":"7c5bf061313957da359833fc6d4b3e797dab995a852a5966bbeda4be583856f1"} Apr 24 23:55:04.041161 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.041137 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" event={"ID":"a818f70c-58fa-40be-b78e-11838a11f8b9","Type":"ContainerStarted","Data":"7c3e7beda4dc8c4c40cc122ef539113efe8abb11321f860f0788a16b6d08c153"} Apr 24 23:55:04.041399 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.041344 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:55:04.048062 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.048027 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" Apr 24 23:55:04.053670 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:04.052681 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zsqnv" podStartSLOduration=70.325450231 podStartE2EDuration="1m17.052664958s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:56.836258392 +0000 UTC m=+69.826662395" lastFinishedPulling="2026-04-24 23:55:03.563473116 +0000 UTC m=+76.553877122" observedRunningTime="2026-04-24 23:55:04.050451329 +0000 UTC m=+77.040855344" watchObservedRunningTime="2026-04-24 23:55:04.052664958 +0000 UTC m=+77.043068972" Apr 24 23:55:04.068527 ip-10-0-142-30 kubenswrapper[2569]: 
I0424 23:55:04.068448 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fxrzl" podStartSLOduration=8.159174615 podStartE2EDuration="15.068428363s" podCreationTimestamp="2026-04-24 23:54:49 +0000 UTC" firstStartedPulling="2026-04-24 23:54:56.611176046 +0000 UTC m=+69.601580042" lastFinishedPulling="2026-04-24 23:55:03.520429794 +0000 UTC m=+76.510833790" observedRunningTime="2026-04-24 23:55:04.066398609 +0000 UTC m=+77.056802628" watchObservedRunningTime="2026-04-24 23:55:04.068428363 +0000 UTC m=+77.058832380" Apr 24 23:55:05.053301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.053261 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" event={"ID":"43258f38-cf4f-4e21-b609-e209e07de132","Type":"ContainerStarted","Data":"9076b6996f6f6517fad0a71d5bc87150a77932b847d7cdbab5c9c91ceec7ee16"} Apr 24 23:55:05.053760 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.053634 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:55:05.055578 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.055534 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z72wj" event={"ID":"865ada3d-5576-4faa-98c1-2b867558ffc0","Type":"ContainerStarted","Data":"deb5d5ae625811c0f4a8a8f55318703ef7ca6b1d957de8efd57825dfaa1fe8b0"} Apr 24 23:55:05.060513 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.060492 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" Apr 24 23:55:05.078535 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.077752 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-865cb5cb95-mp69m" podStartSLOduration=2.523555535 podStartE2EDuration="9.077737416s" 
podCreationTimestamp="2026-04-24 23:54:56 +0000 UTC" firstStartedPulling="2026-04-24 23:54:56.99925424 +0000 UTC m=+69.989658239" lastFinishedPulling="2026-04-24 23:55:03.55343612 +0000 UTC m=+76.543840120" observedRunningTime="2026-04-24 23:55:04.087038341 +0000 UTC m=+77.077442380" watchObservedRunningTime="2026-04-24 23:55:05.077737416 +0000 UTC m=+78.068141431" Apr 24 23:55:05.078830 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.078794 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-58c9db9969-hg2cg" podStartSLOduration=3.016127589 podStartE2EDuration="18.078783227s" podCreationTimestamp="2026-04-24 23:54:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:48.457773176 +0000 UTC m=+61.448177172" lastFinishedPulling="2026-04-24 23:55:03.520428804 +0000 UTC m=+76.510832810" observedRunningTime="2026-04-24 23:55:05.076786723 +0000 UTC m=+78.067190736" watchObservedRunningTime="2026-04-24 23:55:05.078783227 +0000 UTC m=+78.069187241" Apr 24 23:55:05.091058 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:05.091011 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z72wj" podStartSLOduration=71.856691817 podStartE2EDuration="1m18.090997817s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:54:56.841524089 +0000 UTC m=+69.831928083" lastFinishedPulling="2026-04-24 23:55:03.075830084 +0000 UTC m=+76.066234083" observedRunningTime="2026-04-24 23:55:05.09069876 +0000 UTC m=+78.081102767" watchObservedRunningTime="2026-04-24 23:55:05.090997817 +0000 UTC m=+78.081401833" Apr 24 23:55:06.843636 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:06.843597 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:55:06.843636 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:06.843641 2569 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:55:06.848330 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:06.848306 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:55:07.065897 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:07.065868 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-865cb5cb95-mp69m" Apr 24 23:55:35.057967 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:55:35.057936 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zsqnv" Apr 24 23:56:10.835883 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.835849 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-545b6b789f-sk9dk"] Apr 24 23:56:10.839102 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.839081 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:10.849042 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.849017 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-545b6b789f-sk9dk"] Apr 24 23:56:10.927108 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-service-ca\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:10.927108 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-oauth-serving-cert\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:10.927301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-oauth-config\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:10.927301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh8h6\" (UniqueName: \"kubernetes.io/projected/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-kube-api-access-rh8h6\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 
23:56:10.927301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-serving-cert\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:10.927301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-trusted-ca-bundle\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:10.927301 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:10.927253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-config\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028132 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-service-ca\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028132 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028132 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-oauth-serving-cert\") pod 
\"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028413 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028164 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-oauth-config\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028413 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh8h6\" (UniqueName: \"kubernetes.io/projected/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-kube-api-access-rh8h6\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028413 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-serving-cert\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028413 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-trusted-ca-bundle\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028413 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-config\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028950 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028922 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-oauth-serving-cert\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.028950 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.028937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-service-ca\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.029117 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.029098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-config\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.029176 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.029132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-trusted-ca-bundle\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.030640 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.030615 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-oauth-config\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.030747 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.030729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-serving-cert\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.036045 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.036023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh8h6\" (UniqueName: \"kubernetes.io/projected/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-kube-api-access-rh8h6\") pod \"console-545b6b789f-sk9dk\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.148891 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.148857 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:11.267659 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:11.267628 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-545b6b789f-sk9dk"] Apr 24 23:56:11.272607 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:56:11.272584 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a24722d_c0e1_469a_b8dd_f2c0b3a5e5cd.slice/crio-53be2d5ac9641c7dbbef4850294612767e424e4693a65c4d00bdc6b30bd9afac WatchSource:0}: Error finding container 53be2d5ac9641c7dbbef4850294612767e424e4693a65c4d00bdc6b30bd9afac: Status 404 returned error can't find the container with id 53be2d5ac9641c7dbbef4850294612767e424e4693a65c4d00bdc6b30bd9afac Apr 24 23:56:12.246384 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:12.246330 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545b6b789f-sk9dk" event={"ID":"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd","Type":"ContainerStarted","Data":"0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220"} Apr 24 23:56:12.246782 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:12.246392 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545b6b789f-sk9dk" event={"ID":"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd","Type":"ContainerStarted","Data":"53be2d5ac9641c7dbbef4850294612767e424e4693a65c4d00bdc6b30bd9afac"} Apr 24 23:56:12.263906 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:12.263862 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-545b6b789f-sk9dk" podStartSLOduration=2.263848326 podStartE2EDuration="2.263848326s" podCreationTimestamp="2026-04-24 23:56:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:12.262620708 +0000 UTC m=+145.253024723" 
watchObservedRunningTime="2026-04-24 23:56:12.263848326 +0000 UTC m=+145.254252336" Apr 24 23:56:21.149022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:21.148985 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:21.149022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:21.149030 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:21.153605 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:21.153582 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:21.274226 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:21.274192 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-545b6b789f-sk9dk" Apr 24 23:56:21.328415 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:21.328385 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-865cb5cb95-mp69m"] Apr 24 23:56:46.347414 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.347337 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-865cb5cb95-mp69m" podUID="0f2787d4-5657-4c05-92fa-078f95fe836c" containerName="console" containerID="cri-o://58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd" gracePeriod=15 Apr 24 23:56:46.585289 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.585267 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-865cb5cb95-mp69m_0f2787d4-5657-4c05-92fa-078f95fe836c/console/0.log" Apr 24 23:56:46.585416 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.585340 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-865cb5cb95-mp69m"
Apr 24 23:56:46.715550 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715503 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-oauth-serving-cert\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.715738 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715561 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-trusted-ca-bundle\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.715738 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715587 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-service-ca\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.715738 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715610 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-serving-cert\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.715738 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715635 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbt25\" (UniqueName: \"kubernetes.io/projected/0f2787d4-5657-4c05-92fa-078f95fe836c-kube-api-access-zbt25\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.715738 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715709 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-oauth-config\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.715738 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715735 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-console-config\") pod \"0f2787d4-5657-4c05-92fa-078f95fe836c\" (UID: \"0f2787d4-5657-4c05-92fa-078f95fe836c\") "
Apr 24 23:56:46.716025 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.715934 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:46.716025 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.716006 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-service-ca" (OuterVolumeSpecName: "service-ca") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:46.716223 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.716073 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:46.716347 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.716265 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-console-config" (OuterVolumeSpecName: "console-config") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:56:46.717886 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.717860 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:46.717966 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.717916 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:56:46.718008 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.717982 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2787d4-5657-4c05-92fa-078f95fe836c-kube-api-access-zbt25" (OuterVolumeSpecName: "kube-api-access-zbt25") pod "0f2787d4-5657-4c05-92fa-078f95fe836c" (UID: "0f2787d4-5657-4c05-92fa-078f95fe836c"). InnerVolumeSpecName "kube-api-access-zbt25". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:56:46.817183 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817147 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-oauth-serving-cert\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:46.817183 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817180 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-trusted-ca-bundle\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:46.817183 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817191 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-service-ca\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:46.817434 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817201 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-serving-cert\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:46.817434 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817209 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbt25\" (UniqueName: \"kubernetes.io/projected/0f2787d4-5657-4c05-92fa-078f95fe836c-kube-api-access-zbt25\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:46.817434 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817219 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f2787d4-5657-4c05-92fa-078f95fe836c-console-oauth-config\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:46.817434 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:46.817228 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f2787d4-5657-4c05-92fa-078f95fe836c-console-config\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\""
Apr 24 23:56:47.343115 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.343090 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-865cb5cb95-mp69m_0f2787d4-5657-4c05-92fa-078f95fe836c/console/0.log"
Apr 24 23:56:47.343272 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.343128 2569 generic.go:358] "Generic (PLEG): container finished" podID="0f2787d4-5657-4c05-92fa-078f95fe836c" containerID="58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd" exitCode=2
Apr 24 23:56:47.343272 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.343183 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865cb5cb95-mp69m" event={"ID":"0f2787d4-5657-4c05-92fa-078f95fe836c","Type":"ContainerDied","Data":"58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd"}
Apr 24 23:56:47.343272 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.343225 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-865cb5cb95-mp69m" event={"ID":"0f2787d4-5657-4c05-92fa-078f95fe836c","Type":"ContainerDied","Data":"934bcf03ce3612290814e2734281c2ce394643e170bd6e123a0995e5f9507d5b"}
Apr 24 23:56:47.343272 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.343242 2569 scope.go:117] "RemoveContainer" containerID="58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd"
Apr 24 23:56:47.343490 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.343195 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-865cb5cb95-mp69m"
Apr 24 23:56:47.352298 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.352202 2569 scope.go:117] "RemoveContainer" containerID="58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd"
Apr 24 23:56:47.352563 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:56:47.352457 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd\": container with ID starting with 58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd not found: ID does not exist" containerID="58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd"
Apr 24 23:56:47.352563 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.352483 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd"} err="failed to get container status \"58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd\": rpc error: code = NotFound desc = could not find container \"58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd\": container with ID starting with 58290097c179d4966c23094c42edd5006f84919567391699d78ee77aff6d0ccd not found: ID does not exist"
Apr 24 23:56:47.363527 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.363504 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-865cb5cb95-mp69m"]
Apr 24 23:56:47.367664 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.367644 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-865cb5cb95-mp69m"]
Apr 24 23:56:47.553172 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:47.553139 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2787d4-5657-4c05-92fa-078f95fe836c" path="/var/lib/kubelet/pods/0f2787d4-5657-4c05-92fa-078f95fe836c/volumes"
Apr 24 23:56:52.369692 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.369654 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bvkc6"]
Apr 24 23:56:52.370133 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.369952 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f2787d4-5657-4c05-92fa-078f95fe836c" containerName="console"
Apr 24 23:56:52.370133 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.369963 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2787d4-5657-4c05-92fa-078f95fe836c" containerName="console"
Apr 24 23:56:52.370133 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.370016 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f2787d4-5657-4c05-92fa-078f95fe836c" containerName="console"
Apr 24 23:56:52.374411 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.374392 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.382225 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.382206 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 23:56:52.389202 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.389179 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bvkc6"]
Apr 24 23:56:52.461052 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.461017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/753e02b2-3f16-4868-8a5c-824bcd6fe13f-kubelet-config\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.461203 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.461066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/753e02b2-3f16-4868-8a5c-824bcd6fe13f-dbus\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.461203 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.461088 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/753e02b2-3f16-4868-8a5c-824bcd6fe13f-original-pull-secret\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.562022 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.561991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/753e02b2-3f16-4868-8a5c-824bcd6fe13f-kubelet-config\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.562187 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.562040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/753e02b2-3f16-4868-8a5c-824bcd6fe13f-dbus\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.562187 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.562061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/753e02b2-3f16-4868-8a5c-824bcd6fe13f-original-pull-secret\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.562187 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.562118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/753e02b2-3f16-4868-8a5c-824bcd6fe13f-kubelet-config\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.562291 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.562248 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/753e02b2-3f16-4868-8a5c-824bcd6fe13f-dbus\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.564327 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.564310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/753e02b2-3f16-4868-8a5c-824bcd6fe13f-original-pull-secret\") pod \"global-pull-secret-syncer-bvkc6\" (UID: \"753e02b2-3f16-4868-8a5c-824bcd6fe13f\") " pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.683262 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.683143 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bvkc6"
Apr 24 23:56:52.802221 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:52.802191 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bvkc6"]
Apr 24 23:56:52.805305 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:56:52.805277 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753e02b2_3f16_4868_8a5c_824bcd6fe13f.slice/crio-2d1b28a1be20ac9fa475f9616865142a4ee1b12da82874a718515a6df04b6114 WatchSource:0}: Error finding container 2d1b28a1be20ac9fa475f9616865142a4ee1b12da82874a718515a6df04b6114: Status 404 returned error can't find the container with id 2d1b28a1be20ac9fa475f9616865142a4ee1b12da82874a718515a6df04b6114
Apr 24 23:56:53.362416 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:53.362379 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bvkc6" event={"ID":"753e02b2-3f16-4868-8a5c-824bcd6fe13f","Type":"ContainerStarted","Data":"2d1b28a1be20ac9fa475f9616865142a4ee1b12da82874a718515a6df04b6114"}
Apr 24 23:56:57.376358 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:57.376318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bvkc6" event={"ID":"753e02b2-3f16-4868-8a5c-824bcd6fe13f","Type":"ContainerStarted","Data":"56bd1182bd365c11b98e17112386d9ce342c1a482cbadb26aa7cb2da87ac1f14"}
Apr 24 23:56:57.391017 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:56:57.390968 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bvkc6" podStartSLOduration=1.886887546 podStartE2EDuration="5.390953884s" podCreationTimestamp="2026-04-24 23:56:52 +0000 UTC" firstStartedPulling="2026-04-24 23:56:52.807322641 +0000 UTC m=+185.797726645" lastFinishedPulling="2026-04-24 23:56:56.31138899 +0000 UTC m=+189.301792983" observedRunningTime="2026-04-24 23:56:57.389228269 +0000 UTC m=+190.379632281" watchObservedRunningTime="2026-04-24 23:56:57.390953884 +0000 UTC m=+190.381357897"
Apr 24 23:57:12.095905 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.095868 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"]
Apr 24 23:57:12.099851 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.099833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.102191 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.102157 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 23:57:12.102992 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.102969 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 23:57:12.103072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.102992 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 23:57:12.103072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.103023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-68dvc\""
Apr 24 23:57:12.103072 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.102995 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 23:57:12.107834 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.107805 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"]
Apr 24 23:57:12.110698 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.110677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9f8f1362-41cd-4c6a-9283-5705f3000e5d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d64889f-ll9hp\" (UID: \"9f8f1362-41cd-4c6a-9283-5705f3000e5d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.110778 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.110728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhscf\" (UniqueName: \"kubernetes.io/projected/9f8f1362-41cd-4c6a-9283-5705f3000e5d-kube-api-access-fhscf\") pod \"managed-serviceaccount-addon-agent-d64889f-ll9hp\" (UID: \"9f8f1362-41cd-4c6a-9283-5705f3000e5d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.186240 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.186203 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"]
Apr 24 23:57:12.190415 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.190398 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.192905 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.192885 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 23:57:12.197793 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.197773 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"]
Apr 24 23:57:12.211686 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.211659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9f8f1362-41cd-4c6a-9283-5705f3000e5d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d64889f-ll9hp\" (UID: \"9f8f1362-41cd-4c6a-9283-5705f3000e5d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.211866 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.211841 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6r4\" (UniqueName: \"kubernetes.io/projected/721c8d52-805f-48c7-865c-a9495c4e10bd-kube-api-access-ft6r4\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.211928 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.211898 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/721c8d52-805f-48c7-865c-a9495c4e10bd-tmp\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.211928 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.211921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/721c8d52-805f-48c7-865c-a9495c4e10bd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.212023 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.211955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhscf\" (UniqueName: \"kubernetes.io/projected/9f8f1362-41cd-4c6a-9283-5705f3000e5d-kube-api-access-fhscf\") pod \"managed-serviceaccount-addon-agent-d64889f-ll9hp\" (UID: \"9f8f1362-41cd-4c6a-9283-5705f3000e5d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.214724 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.214697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9f8f1362-41cd-4c6a-9283-5705f3000e5d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-d64889f-ll9hp\" (UID: \"9f8f1362-41cd-4c6a-9283-5705f3000e5d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.216944 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.216925 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"]
Apr 24 23:57:12.219817 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.219798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhscf\" (UniqueName: \"kubernetes.io/projected/9f8f1362-41cd-4c6a-9283-5705f3000e5d-kube-api-access-fhscf\") pod \"managed-serviceaccount-addon-agent-d64889f-ll9hp\" (UID: \"9f8f1362-41cd-4c6a-9283-5705f3000e5d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.221839 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.221821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.223749 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.223727 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 23:57:12.223839 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.223773 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 23:57:12.223901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.223777 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 23:57:12.223901 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.223777 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 23:57:12.226965 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.226944 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"]
Apr 24 23:57:12.312684 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6r4\" (UniqueName: \"kubernetes.io/projected/721c8d52-805f-48c7-865c-a9495c4e10bd-kube-api-access-ft6r4\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312735 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/721c8d52-805f-48c7-865c-a9495c4e10bd-tmp\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/721c8d52-805f-48c7-865c-a9495c4e10bd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312780 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4m4l\" (UniqueName: \"kubernetes.io/projected/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-kube-api-access-d4m4l\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-hub\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.312869 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312858 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-ca\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.313151 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.312888 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.313701 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.313673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/721c8d52-805f-48c7-865c-a9495c4e10bd-tmp\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.315237 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.315218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/721c8d52-805f-48c7-865c-a9495c4e10bd-klusterlet-config\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.321661 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.321632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6r4\" (UniqueName: \"kubernetes.io/projected/721c8d52-805f-48c7-865c-a9495c4e10bd-kube-api-access-ft6r4\") pod \"klusterlet-addon-workmgr-6d8765bb79-7qhjl\" (UID: \"721c8d52-805f-48c7-865c-a9495c4e10bd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.414004 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.413962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4m4l\" (UniqueName: \"kubernetes.io/projected/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-kube-api-access-d4m4l\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.414004 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.414015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-hub\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.414239 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.414043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.414239 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.414065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-ca\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.414239 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.414082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.414239 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.414103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.415508 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.415477 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.416867 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.416830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.416956 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.416919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-hub\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.416956 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.416929 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-ca\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.417029 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.416980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.421420 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.421400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4m4l\" (UniqueName: \"kubernetes.io/projected/33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c-kube-api-access-d4m4l\") pod \"cluster-proxy-proxy-agent-67bdfff7d8-m9hb4\" (UID: \"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
Apr 24 23:57:12.426253 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.426236 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"
Apr 24 23:57:12.499660 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.499633 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"
Apr 24 23:57:12.540202 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.540172 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4" Apr 24 23:57:12.546037 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.545951 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp"] Apr 24 23:57:12.550714 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:57:12.550682 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f8f1362_41cd_4c6a_9283_5705f3000e5d.slice/crio-cf4b425e962122b7cc8f7bacc393a85771dec227380dd457b6b402da04bdc280 WatchSource:0}: Error finding container cf4b425e962122b7cc8f7bacc393a85771dec227380dd457b6b402da04bdc280: Status 404 returned error can't find the container with id cf4b425e962122b7cc8f7bacc393a85771dec227380dd457b6b402da04bdc280 Apr 24 23:57:12.626508 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.626478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl"] Apr 24 23:57:12.629545 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:57:12.629512 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod721c8d52_805f_48c7_865c_a9495c4e10bd.slice/crio-2f6cab2660693354bd846518249ac803163e4ec858bb6450801145513aca9f5a WatchSource:0}: Error finding container 2f6cab2660693354bd846518249ac803163e4ec858bb6450801145513aca9f5a: Status 404 returned error can't find the container with id 2f6cab2660693354bd846518249ac803163e4ec858bb6450801145513aca9f5a Apr 24 23:57:12.675741 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:12.675673 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"] Apr 24 23:57:12.678599 ip-10-0-142-30 kubenswrapper[2569]: W0424 23:57:12.678574 2569 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33ab6c2e_7c41_46fc_82d9_6fd9ab553d0c.slice/crio-41dca8ef045c48fe3a3c57efd38b3264b3ab64ed56ab923732715f697356973e WatchSource:0}: Error finding container 41dca8ef045c48fe3a3c57efd38b3264b3ab64ed56ab923732715f697356973e: Status 404 returned error can't find the container with id 41dca8ef045c48fe3a3c57efd38b3264b3ab64ed56ab923732715f697356973e Apr 24 23:57:13.421787 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:13.421748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp" event={"ID":"9f8f1362-41cd-4c6a-9283-5705f3000e5d","Type":"ContainerStarted","Data":"cf4b425e962122b7cc8f7bacc393a85771dec227380dd457b6b402da04bdc280"} Apr 24 23:57:13.423177 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:13.423138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4" event={"ID":"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c","Type":"ContainerStarted","Data":"41dca8ef045c48fe3a3c57efd38b3264b3ab64ed56ab923732715f697356973e"} Apr 24 23:57:13.424548 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:13.424508 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" event={"ID":"721c8d52-805f-48c7-865c-a9495c4e10bd","Type":"ContainerStarted","Data":"2f6cab2660693354bd846518249ac803163e4ec858bb6450801145513aca9f5a"} Apr 24 23:57:14.113364 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:57:14.113303 2569 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/25/25cc66d094d3cc0b8a4c5aa961b1875071111305152e65c7795f04a2c9b1e4df?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260424%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260424T235712Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=ed88b14f989a47564f9e8b3c4699f9dccdb6e39ad1ab755ec2c4f48d12fe9a88&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=multicluster-engine----multicloud-manager-rhel9&akamai_signature=exp=1777075932~hmac=1e7640f098bf7177f478df22e03a3b50b7e18c38db167b3ed5adf75d31e7fe11\": remote error: tls: internal error; artifact err: provided artifact is a container image" image="registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253" Apr 24 23:57:14.113676 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:57:14.113591 2569 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:acm-agent,Image:registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253,Command:[],Args:[/agent --port=4443 --agent-port=443 --hub-kubeconfig=/var/run/klusterlet/kubeconfig --cluster-name=f26f573c-7668-491d-8e11-eecd093e4374 --agent-name=klusterlet-addon-workmgr],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{memory: {{134217728 0} {}
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:klusterlet-config,ReadOnly:false,MountPath:/var/run/klusterlet,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ft6r4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000610000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod klusterlet-addon-workmgr-6d8765bb79-7qhjl_open-cluster-management-agent-addon(721c8d52-805f-48c7-865c-a9495c4e10bd): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: 
Get \"https://cdn01.quay.io/quayio-production-s3/sha256/25/25cc66d094d3cc0b8a4c5aa961b1875071111305152e65c7795f04a2c9b1e4df?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260424%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260424T235712Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=ed88b14f989a47564f9e8b3c4699f9dccdb6e39ad1ab755ec2c4f48d12fe9a88&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=multicluster-engine----multicloud-manager-rhel9&akamai_signature=exp=1777075932~hmac=1e7640f098bf7177f478df22e03a3b50b7e18c38db167b3ed5adf75d31e7fe11\": remote error: tls: internal error; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 24 23:57:14.115068 ip-10-0-142-30 kubenswrapper[2569]: E0424 23:57:14.115023 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/25/25cc66d094d3cc0b8a4c5aa961b1875071111305152e65c7795f04a2c9b1e4df?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260424%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260424T235712Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=ed88b14f989a47564f9e8b3c4699f9dccdb6e39ad1ab755ec2c4f48d12fe9a88&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=multicluster-engine----multicloud-manager-rhel9&akamai_signature=exp=1777075932~hmac=1e7640f098bf7177f478df22e03a3b50b7e18c38db167b3ed5adf75d31e7fe11\\\": remote error: tls: internal error; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" podUID="721c8d52-805f-48c7-865c-a9495c4e10bd" Apr 24 23:57:14.429847 ip-10-0-142-30 kubenswrapper[2569]: E0424
23:57:14.429802 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/multicluster-engine/multicloud-manager-rhel9@sha256:78586c172f6852b6656ff964750750f86496d5d04accefb3e857458ef198e253\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/25/25cc66d094d3cc0b8a4c5aa961b1875071111305152e65c7795f04a2c9b1e4df?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260424%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260424T235712Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=ed88b14f989a47564f9e8b3c4699f9dccdb6e39ad1ab755ec2c4f48d12fe9a88&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=multicluster-engine----multicloud-manager-rhel9&akamai_signature=exp=1777075932~hmac=1e7640f098bf7177f478df22e03a3b50b7e18c38db167b3ed5adf75d31e7fe11\\\": remote error: tls: internal error; artifact err: provided artifact is a container image\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" podUID="721c8d52-805f-48c7-865c-a9495c4e10bd" Apr 24 23:57:16.438813 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:16.438775 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp" event={"ID":"9f8f1362-41cd-4c6a-9283-5705f3000e5d","Type":"ContainerStarted","Data":"ed4904f3b447919932392178fcd5627c29984eadea20d0e3f0d4f0f99ad962ca"} Apr 24 23:57:16.440042 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:16.440016 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4"
event={"ID":"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c","Type":"ContainerStarted","Data":"d5447c6724207fc2414e6c01fc0dead05e116b720f7e9d9f256263cee59eee36"} Apr 24 23:57:16.454633 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:16.454588 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-d64889f-ll9hp" podStartSLOduration=1.082103117 podStartE2EDuration="4.454574298s" podCreationTimestamp="2026-04-24 23:57:12 +0000 UTC" firstStartedPulling="2026-04-24 23:57:12.553298942 +0000 UTC m=+205.543702935" lastFinishedPulling="2026-04-24 23:57:15.925770097 +0000 UTC m=+208.916174116" observedRunningTime="2026-04-24 23:57:16.452890014 +0000 UTC m=+209.443294026" watchObservedRunningTime="2026-04-24 23:57:16.454574298 +0000 UTC m=+209.444978310" Apr 24 23:57:18.447336 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:18.447253 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4" event={"ID":"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c","Type":"ContainerStarted","Data":"048fd8a8a73e7d8db660d38dcd2b6f86eaa4c5978023abf01052f8eee0c18a75"} Apr 24 23:57:18.447336 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:18.447293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4" event={"ID":"33ab6c2e-7c41-46fc-82d9-6fd9ab553d0c","Type":"ContainerStarted","Data":"004081be81cefb9eed885fc0712c849e7a634c67ac8c67c4ad01bc92ab3de4d2"} Apr 24 23:57:18.465196 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:18.465135 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67bdfff7d8-m9hb4" podStartSLOduration=1.034606763 podStartE2EDuration="6.465117102s" podCreationTimestamp="2026-04-24 23:57:12 +0000 UTC" firstStartedPulling="2026-04-24 23:57:12.680195211 +0000 
UTC m=+205.670599201" lastFinishedPulling="2026-04-24 23:57:18.110705545 +0000 UTC m=+211.101109540" observedRunningTime="2026-04-24 23:57:18.463503908 +0000 UTC m=+211.453907934" watchObservedRunningTime="2026-04-24 23:57:18.465117102 +0000 UTC m=+211.455521116" Apr 24 23:57:41.518170 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:41.518128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" event={"ID":"721c8d52-805f-48c7-865c-a9495c4e10bd","Type":"ContainerStarted","Data":"7169b24ef655f2b6cccda5445c2fa9fc8a6c2bf98a9caf806d32860607373505"} Apr 24 23:57:41.518605 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:41.518468 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" Apr 24 23:57:41.520095 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:41.520051 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" podUID="721c8d52-805f-48c7-865c-a9495c4e10bd" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.20:8000/readyz\": dial tcp 10.134.0.20:8000: connect: connection refused" Apr 24 23:57:41.536246 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:41.536191 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" podStartSLOduration=0.728428771 podStartE2EDuration="29.536173568s" podCreationTimestamp="2026-04-24 23:57:12 +0000 UTC" firstStartedPulling="2026-04-24 23:57:12.631722141 +0000 UTC m=+205.622126131" lastFinishedPulling="2026-04-24 23:57:41.439466926 +0000 UTC m=+234.429870928" observedRunningTime="2026-04-24 23:57:41.534199839 +0000 UTC m=+234.524603846" watchObservedRunningTime="2026-04-24 23:57:41.536173568 +0000 UTC m=+234.526577582" Apr 24 
23:57:42.521682 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:57:42.521657 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6d8765bb79-7qhjl" Apr 24 23:58:47.436923 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:58:47.436894 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 24 23:58:47.437428 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:58:47.437109 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 24 23:58:47.443966 ip-10-0-142-30 kubenswrapper[2569]: I0424 23:58:47.443947 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 25 00:00:17.229959 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:17.229923 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-545b6b789f-sk9dk"] Apr 25 00:00:29.744992 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.744954 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx"] Apr 25 00:00:29.748328 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.748312 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.750588 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.750556 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:00:29.751254 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.751215 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-b9x42\"" Apr 25 00:00:29.751349 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.751254 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\"" Apr 25 00:00:29.751349 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.751281 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\"" Apr 25 00:00:29.751349 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.751298 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:00:29.756602 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.756583 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx"] Apr 25 00:00:29.824189 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.824113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bc37e2d-ed58-4a96-8321-b8f778045a33-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.824189 ip-10-0-142-30 kubenswrapper[2569]: I0425 
00:00:29.824170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcswx\" (UniqueName: \"kubernetes.io/projected/8bc37e2d-ed58-4a96-8321-b8f778045a33-kube-api-access-fcswx\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.824428 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.824206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc37e2d-ed58-4a96-8321-b8f778045a33-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.824428 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.824277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bc37e2d-ed58-4a96-8321-b8f778045a33-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.924909 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.924876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bc37e2d-ed58-4a96-8321-b8f778045a33-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.925084 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.924931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bc37e2d-ed58-4a96-8321-b8f778045a33-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.925084 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.924957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcswx\" (UniqueName: \"kubernetes.io/projected/8bc37e2d-ed58-4a96-8321-b8f778045a33-kube-api-access-fcswx\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.925084 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.924980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc37e2d-ed58-4a96-8321-b8f778045a33-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.925438 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.925417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc37e2d-ed58-4a96-8321-b8f778045a33-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.925786 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.925761 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/8bc37e2d-ed58-4a96-8321-b8f778045a33-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.927471 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.927449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bc37e2d-ed58-4a96-8321-b8f778045a33-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:29.932641 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:29.932620 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcswx\" (UniqueName: \"kubernetes.io/projected/8bc37e2d-ed58-4a96-8321-b8f778045a33-kube-api-access-fcswx\") pod \"isvc-xgboost-graph-predictor-669d8d6456-xh9nx\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:30.072631 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:00:30.072602 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/logrotate.service\": RecentStats: unable to find data in memory cache]" Apr 25 00:00:30.072765 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:00:30.072639 2569 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/logrotate.service\": RecentStats: unable to find data in memory cache]" Apr 25 00:00:30.079397 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:30.079345 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:00:30.196637 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:30.196458 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx"] Apr 25 00:00:30.199694 ip-10-0-142-30 kubenswrapper[2569]: W0425 00:00:30.199667 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc37e2d_ed58_4a96_8321_b8f778045a33.slice/crio-32a79b027aa0f8e46bbee307117961c31858057ec171e53b1bf38e9dc0eea33f WatchSource:0}: Error finding container 32a79b027aa0f8e46bbee307117961c31858057ec171e53b1bf38e9dc0eea33f: Status 404 returned error can't find the container with id 32a79b027aa0f8e46bbee307117961c31858057ec171e53b1bf38e9dc0eea33f Apr 25 00:00:30.201441 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:30.201425 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:00:30.993383 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:30.993332 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerStarted","Data":"32a79b027aa0f8e46bbee307117961c31858057ec171e53b1bf38e9dc0eea33f"} Apr 25 00:00:34.003477 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:34.003427 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerStarted","Data":"e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68"} Apr 25 00:00:38.016838 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:38.016803 2569 generic.go:358] "Generic (PLEG): container finished" podID="8bc37e2d-ed58-4a96-8321-b8f778045a33" 
containerID="e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68" exitCode=0 Apr 25 00:00:38.017200 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:38.016877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerDied","Data":"e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68"} Apr 25 00:00:42.249005 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.248962 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-545b6b789f-sk9dk" podUID="3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" containerName="console" containerID="cri-o://0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220" gracePeriod=15 Apr 25 00:00:42.495836 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.495814 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-545b6b789f-sk9dk_3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd/console/0.log" Apr 25 00:00:42.495958 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.495876 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-545b6b789f-sk9dk" Apr 25 00:00:42.635521 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635490 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-service-ca\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635704 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635588 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-oauth-serving-cert\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635704 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635609 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-serving-cert\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635704 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635636 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh8h6\" (UniqueName: \"kubernetes.io/projected/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-kube-api-access-rh8h6\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635704 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635667 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-config\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635704 ip-10-0-142-30 
kubenswrapper[2569]: I0425 00:00:42.635700 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-trusted-ca-bundle\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635955 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635725 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-oauth-config\") pod \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\" (UID: \"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd\") " Apr 25 00:00:42.635955 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.635905 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:42.636114 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.636083 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:42.636489 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.636441 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:42.636611 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.636584 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-config" (OuterVolumeSpecName: "console-config") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:42.638455 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.638429 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:00:42.638746 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.638688 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-kube-api-access-rh8h6" (OuterVolumeSpecName: "kube-api-access-rh8h6") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "kube-api-access-rh8h6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:00:42.638746 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.638710 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" (UID: "3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:00:42.737199 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737160 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rh8h6\" (UniqueName: \"kubernetes.io/projected/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-kube-api-access-rh8h6\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:42.737199 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737199 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-config\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:42.737486 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737215 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-trusted-ca-bundle\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:42.737486 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737230 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-oauth-config\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:42.737486 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737244 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-service-ca\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:42.737486 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737258 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-oauth-serving-cert\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:42.737486 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:42.737272 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd-console-serving-cert\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:00:43.033529 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.033452 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-545b6b789f-sk9dk_3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd/console/0.log" Apr 25 00:00:43.033529 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.033500 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" containerID="0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220" exitCode=2 Apr 25 00:00:43.033749 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.033546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545b6b789f-sk9dk" event={"ID":"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd","Type":"ContainerDied","Data":"0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220"} Apr 25 00:00:43.033749 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.033576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-545b6b789f-sk9dk" event={"ID":"3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd","Type":"ContainerDied","Data":"53be2d5ac9641c7dbbef4850294612767e424e4693a65c4d00bdc6b30bd9afac"} Apr 25 00:00:43.033749 ip-10-0-142-30 
kubenswrapper[2569]: I0425 00:00:43.033584 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-545b6b789f-sk9dk" Apr 25 00:00:43.033749 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.033595 2569 scope.go:117] "RemoveContainer" containerID="0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220" Apr 25 00:00:43.044205 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.044178 2569 scope.go:117] "RemoveContainer" containerID="0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220" Apr 25 00:00:43.044561 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:00:43.044534 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220\": container with ID starting with 0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220 not found: ID does not exist" containerID="0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220" Apr 25 00:00:43.044669 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.044574 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220"} err="failed to get container status \"0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220\": rpc error: code = NotFound desc = could not find container \"0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220\": container with ID starting with 0970ebd9ffe8e8619043954b5a537d7ade347d5b7c2b771d6044ef552adc1220 not found: ID does not exist" Apr 25 00:00:43.067495 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.067468 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-545b6b789f-sk9dk"] Apr 25 00:00:43.071196 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.071172 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-545b6b789f-sk9dk"] Apr 25 00:00:43.553642 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:00:43.553610 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" path="/var/lib/kubelet/pods/3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd/volumes" Apr 25 00:01:00.090322 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:00.090282 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerStarted","Data":"d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6"} Apr 25 00:01:02.097764 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:02.097731 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerStarted","Data":"2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c"} Apr 25 00:01:02.098130 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:02.097842 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:01:02.115884 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:02.115801 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podStartSLOduration=1.348209605 podStartE2EDuration="33.115785729s" podCreationTimestamp="2026-04-25 00:00:29 +0000 UTC" firstStartedPulling="2026-04-25 00:00:30.201559552 +0000 UTC m=+403.191963542" lastFinishedPulling="2026-04-25 00:01:01.969135672 +0000 UTC m=+434.959539666" observedRunningTime="2026-04-25 00:01:02.114815081 +0000 UTC m=+435.105219115" watchObservedRunningTime="2026-04-25 00:01:02.115785729 +0000 UTC m=+435.106189744" Apr 25 00:01:03.100435 ip-10-0-142-30 kubenswrapper[2569]: I0425 
00:01:03.100404 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:01:03.101583 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:03.101555 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:04.103325 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:04.103287 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:09.108160 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:09.108123 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:01:09.108745 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:09.108717 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:19.108671 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:19.108624 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:29.108778 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:29.108739 2569 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:39.108959 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:39.108919 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:49.109078 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:49.109041 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:01:59.109564 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:01:59.109529 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:02:39.451043 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:39.450968 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx"] Apr 25 00:02:39.451552 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:39.451284 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" containerID="cri-o://d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6" gracePeriod=30 Apr 25 00:02:39.451552 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:39.451341 2569 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kube-rbac-proxy" containerID="cri-o://2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c" gracePeriod=30 Apr 25 00:02:40.381147 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:40.381100 2569 generic.go:358] "Generic (PLEG): container finished" podID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerID="2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c" exitCode=2 Apr 25 00:02:40.381317 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:40.381154 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerDied","Data":"2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c"} Apr 25 00:02:42.885462 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.885429 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:02:42.972387 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.972331 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bc37e2d-ed58-4a96-8321-b8f778045a33-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"8bc37e2d-ed58-4a96-8321-b8f778045a33\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " Apr 25 00:02:42.972540 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.972410 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc37e2d-ed58-4a96-8321-b8f778045a33-kserve-provision-location\") pod \"8bc37e2d-ed58-4a96-8321-b8f778045a33\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " Apr 25 00:02:42.972540 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.972454 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcswx\" (UniqueName: \"kubernetes.io/projected/8bc37e2d-ed58-4a96-8321-b8f778045a33-kube-api-access-fcswx\") pod \"8bc37e2d-ed58-4a96-8321-b8f778045a33\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " Apr 25 00:02:42.972540 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.972487 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bc37e2d-ed58-4a96-8321-b8f778045a33-proxy-tls\") pod \"8bc37e2d-ed58-4a96-8321-b8f778045a33\" (UID: \"8bc37e2d-ed58-4a96-8321-b8f778045a33\") " Apr 25 00:02:42.972743 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.972718 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc37e2d-ed58-4a96-8321-b8f778045a33-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"8bc37e2d-ed58-4a96-8321-b8f778045a33" (UID: "8bc37e2d-ed58-4a96-8321-b8f778045a33"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:02:42.972804 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.972751 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc37e2d-ed58-4a96-8321-b8f778045a33-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "8bc37e2d-ed58-4a96-8321-b8f778045a33" (UID: "8bc37e2d-ed58-4a96-8321-b8f778045a33"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:02:42.974666 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.974642 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc37e2d-ed58-4a96-8321-b8f778045a33-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8bc37e2d-ed58-4a96-8321-b8f778045a33" (UID: "8bc37e2d-ed58-4a96-8321-b8f778045a33"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:02:42.974763 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:42.974665 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc37e2d-ed58-4a96-8321-b8f778045a33-kube-api-access-fcswx" (OuterVolumeSpecName: "kube-api-access-fcswx") pod "8bc37e2d-ed58-4a96-8321-b8f778045a33" (UID: "8bc37e2d-ed58-4a96-8321-b8f778045a33"). InnerVolumeSpecName "kube-api-access-fcswx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:02:43.073953 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.073866 2569 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bc37e2d-ed58-4a96-8321-b8f778045a33-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:02:43.073953 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.073899 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc37e2d-ed58-4a96-8321-b8f778045a33-kserve-provision-location\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:02:43.073953 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.073909 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcswx\" (UniqueName: \"kubernetes.io/projected/8bc37e2d-ed58-4a96-8321-b8f778045a33-kube-api-access-fcswx\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:02:43.073953 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.073918 2569 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bc37e2d-ed58-4a96-8321-b8f778045a33-proxy-tls\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:02:43.390043 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.390010 2569 generic.go:358] "Generic (PLEG): container finished" podID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerID="d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6" exitCode=0 Apr 25 00:02:43.390205 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.390056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" 
event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerDied","Data":"d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6"} Apr 25 00:02:43.390205 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.390085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" event={"ID":"8bc37e2d-ed58-4a96-8321-b8f778045a33","Type":"ContainerDied","Data":"32a79b027aa0f8e46bbee307117961c31858057ec171e53b1bf38e9dc0eea33f"} Apr 25 00:02:43.390205 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.390093 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx" Apr 25 00:02:43.390309 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.390100 2569 scope.go:117] "RemoveContainer" containerID="2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c" Apr 25 00:02:43.399211 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.399192 2569 scope.go:117] "RemoveContainer" containerID="d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6" Apr 25 00:02:43.406396 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.406376 2569 scope.go:117] "RemoveContainer" containerID="e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68" Apr 25 00:02:43.412466 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.412439 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx"] Apr 25 00:02:43.414213 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.414190 2569 scope.go:117] "RemoveContainer" containerID="2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c" Apr 25 00:02:43.414475 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:02:43.414456 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c\": container with ID starting with 2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c not found: ID does not exist" containerID="2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c" Apr 25 00:02:43.414544 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.414484 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c"} err="failed to get container status \"2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c\": rpc error: code = NotFound desc = could not find container \"2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c\": container with ID starting with 2b434d6b0b4ce8a821e0104cd7b9ae5ae826b10956ac15e328fb8abdd17e6f9c not found: ID does not exist" Apr 25 00:02:43.414544 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.414506 2569 scope.go:117] "RemoveContainer" containerID="d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6" Apr 25 00:02:43.414748 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:02:43.414726 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6\": container with ID starting with d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6 not found: ID does not exist" containerID="d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6" Apr 25 00:02:43.414814 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.414759 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6"} err="failed to get container status \"d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6\": rpc error: code = NotFound desc = could not find container 
\"d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6\": container with ID starting with d420f8c87ef8f1ca2e6a08c04c0c0834ff2deb70754cc2b74136c194073909a6 not found: ID does not exist" Apr 25 00:02:43.414814 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.414786 2569 scope.go:117] "RemoveContainer" containerID="e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68" Apr 25 00:02:43.415017 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:02:43.415002 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68\": container with ID starting with e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68 not found: ID does not exist" containerID="e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68" Apr 25 00:02:43.415071 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.415024 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68"} err="failed to get container status \"e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68\": rpc error: code = NotFound desc = could not find container \"e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68\": container with ID starting with e99f0fc56bb7d41a2e3acd1283d049ec4cac1aa080a81850c383a3eb8ba64f68 not found: ID does not exist" Apr 25 00:02:43.417946 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.417923 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-xh9nx"] Apr 25 00:02:43.552457 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:02:43.552426 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" path="/var/lib/kubelet/pods/8bc37e2d-ed58-4a96-8321-b8f778045a33/volumes" Apr 25 00:03:47.458440 
ip-10-0-142-30 kubenswrapper[2569]: I0425 00:03:47.458413 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:03:47.458902 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:03:47.458554 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:08:47.483200 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:08:47.483168 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:08:47.484289 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:08:47.484268 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:13:47.503422 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:13:47.503395 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:13:47.505484 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:13:47.505459 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:18:47.526204 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:18:47.526177 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:18:47.528523 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:18:47.528501 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:23:47.546865 
ip-10-0-142-30 kubenswrapper[2569]: I0425 00:23:47.546768 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:23:47.550244 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:23:47.550220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:28:47.570042 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:28:47.569930 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:28:47.573123 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:28:47.573100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:33:47.596191 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:33:47.596093 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:33:47.601060 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:33:47.601038 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:38:47.615819 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:38:47.615792 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:38:47.622045 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:38:47.622025 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log" Apr 25 00:39:39.888654 
ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888621 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rp79x/must-gather-wxmg7"] Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888911 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kube-rbac-proxy" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888921 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kube-rbac-proxy" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888932 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" containerName="console" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888937 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" containerName="console" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888944 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888950 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888965 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="storage-initializer" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.888970 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="storage-initializer" Apr 25 00:39:39.889102 ip-10-0-142-30 
kubenswrapper[2569]: I0425 00:39:39.889015 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kube-rbac-proxy" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.889026 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc37e2d-ed58-4a96-8321-b8f778045a33" containerName="kserve-container" Apr 25 00:39:39.889102 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.889033 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a24722d-c0e1-469a-b8dd-f2c0b3a5e5cd" containerName="console" Apr 25 00:39:39.891873 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.891857 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:39.894940 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.894911 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rp79x\"/\"default-dockercfg-n4nk7\"" Apr 25 00:39:39.894940 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.894939 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rp79x\"/\"openshift-service-ca.crt\"" Apr 25 00:39:39.895118 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.894921 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rp79x\"/\"kube-root-ca.crt\"" Apr 25 00:39:39.910493 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.910470 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rp79x/must-gather-wxmg7"] Apr 25 00:39:39.984045 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.984002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzcpg\" (UniqueName: \"kubernetes.io/projected/5e303fd7-9b9a-4dae-b966-e7142894cbfd-kube-api-access-pzcpg\") pod 
\"must-gather-wxmg7\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:39.984045 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:39.984038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e303fd7-9b9a-4dae-b966-e7142894cbfd-must-gather-output\") pod \"must-gather-wxmg7\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:40.085060 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.085032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzcpg\" (UniqueName: \"kubernetes.io/projected/5e303fd7-9b9a-4dae-b966-e7142894cbfd-kube-api-access-pzcpg\") pod \"must-gather-wxmg7\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:40.085060 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.085067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e303fd7-9b9a-4dae-b966-e7142894cbfd-must-gather-output\") pod \"must-gather-wxmg7\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:40.085429 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.085413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e303fd7-9b9a-4dae-b966-e7142894cbfd-must-gather-output\") pod \"must-gather-wxmg7\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:40.092336 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.092311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzcpg\" (UniqueName: 
\"kubernetes.io/projected/5e303fd7-9b9a-4dae-b966-e7142894cbfd-kube-api-access-pzcpg\") pod \"must-gather-wxmg7\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:40.200768 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.200709 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:39:40.317514 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.317485 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rp79x/must-gather-wxmg7"] Apr 25 00:39:40.320264 ip-10-0-142-30 kubenswrapper[2569]: W0425 00:39:40.320238 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e303fd7_9b9a_4dae_b966_e7142894cbfd.slice/crio-5d3cdfe5db2ca2fa4414ec297366bb9a493fe84309fd602323a9324385fe0bee WatchSource:0}: Error finding container 5d3cdfe5db2ca2fa4414ec297366bb9a493fe84309fd602323a9324385fe0bee: Status 404 returned error can't find the container with id 5d3cdfe5db2ca2fa4414ec297366bb9a493fe84309fd602323a9324385fe0bee Apr 25 00:39:40.321878 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.321862 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:39:40.535121 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:40.535030 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rp79x/must-gather-wxmg7" event={"ID":"5e303fd7-9b9a-4dae-b966-e7142894cbfd","Type":"ContainerStarted","Data":"5d3cdfe5db2ca2fa4414ec297366bb9a493fe84309fd602323a9324385fe0bee"} Apr 25 00:39:45.554139 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:45.554101 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rp79x/must-gather-wxmg7" 
event={"ID":"5e303fd7-9b9a-4dae-b966-e7142894cbfd","Type":"ContainerStarted","Data":"7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a"} Apr 25 00:39:45.554139 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:45.554139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rp79x/must-gather-wxmg7" event={"ID":"5e303fd7-9b9a-4dae-b966-e7142894cbfd","Type":"ContainerStarted","Data":"1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d"} Apr 25 00:39:45.569145 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:39:45.569098 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rp79x/must-gather-wxmg7" podStartSLOduration=1.998600098 podStartE2EDuration="6.569082943s" podCreationTimestamp="2026-04-25 00:39:39 +0000 UTC" firstStartedPulling="2026-04-25 00:39:40.321990702 +0000 UTC m=+2753.312394693" lastFinishedPulling="2026-04-25 00:39:44.892473542 +0000 UTC m=+2757.882877538" observedRunningTime="2026-04-25 00:39:45.567863327 +0000 UTC m=+2758.558267341" watchObservedRunningTime="2026-04-25 00:39:45.569082943 +0000 UTC m=+2758.559486956" Apr 25 00:40:03.608903 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:03.608868 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerID="1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d" exitCode=0 Apr 25 00:40:03.609310 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:03.608943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rp79x/must-gather-wxmg7" event={"ID":"5e303fd7-9b9a-4dae-b966-e7142894cbfd","Type":"ContainerDied","Data":"1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d"} Apr 25 00:40:03.609310 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:03.609251 2569 scope.go:117] "RemoveContainer" containerID="1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d" Apr 25 00:40:03.943912 ip-10-0-142-30 
kubenswrapper[2569]: I0425 00:40:03.943828 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rp79x_must-gather-wxmg7_5e303fd7-9b9a-4dae-b966-e7142894cbfd/gather/0.log" Apr 25 00:40:07.358165 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:07.358136 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bvkc6_753e02b2-3f16-4868-8a5c-824bcd6fe13f/global-pull-secret-syncer/0.log" Apr 25 00:40:07.484201 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:07.484175 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zklf4_8d604d02-f21f-4a15-84ae-2c0a50e8c899/konnectivity-agent/0.log" Apr 25 00:40:07.559956 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:07.559931 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-30.ec2.internal_02bbe2448f51d29d628074b87ed0230a/haproxy/0.log" Apr 25 00:40:09.332941 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.332908 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rp79x/must-gather-wxmg7"] Apr 25 00:40:09.333319 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.333139 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rp79x/must-gather-wxmg7" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="copy" containerID="cri-o://7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a" gracePeriod=2 Apr 25 00:40:09.335036 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.334984 2569 status_manager.go:895] "Failed to get status for pod" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" pod="openshift-must-gather-rp79x/must-gather-wxmg7" err="pods \"must-gather-wxmg7\" is forbidden: User \"system:node:ip-10-0-142-30.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rp79x\": no relationship found between node 
'ip-10-0-142-30.ec2.internal' and this object" Apr 25 00:40:09.335142 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.335085 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rp79x/must-gather-wxmg7"] Apr 25 00:40:09.557126 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.557106 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rp79x_must-gather-wxmg7_5e303fd7-9b9a-4dae-b966-e7142894cbfd/copy/0.log" Apr 25 00:40:09.557474 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.557457 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:40:09.610846 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.610826 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzcpg\" (UniqueName: \"kubernetes.io/projected/5e303fd7-9b9a-4dae-b966-e7142894cbfd-kube-api-access-pzcpg\") pod \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " Apr 25 00:40:09.610932 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.610868 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e303fd7-9b9a-4dae-b966-e7142894cbfd-must-gather-output\") pod \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\" (UID: \"5e303fd7-9b9a-4dae-b966-e7142894cbfd\") " Apr 25 00:40:09.612337 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.612308 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e303fd7-9b9a-4dae-b966-e7142894cbfd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5e303fd7-9b9a-4dae-b966-e7142894cbfd" (UID: "5e303fd7-9b9a-4dae-b966-e7142894cbfd"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:40:09.612929 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.612913 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e303fd7-9b9a-4dae-b966-e7142894cbfd-kube-api-access-pzcpg" (OuterVolumeSpecName: "kube-api-access-pzcpg") pod "5e303fd7-9b9a-4dae-b966-e7142894cbfd" (UID: "5e303fd7-9b9a-4dae-b966-e7142894cbfd"). InnerVolumeSpecName "kube-api-access-pzcpg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:40:09.625718 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.625700 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rp79x_must-gather-wxmg7_5e303fd7-9b9a-4dae-b966-e7142894cbfd/copy/0.log" Apr 25 00:40:09.626008 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.625984 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerID="7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a" exitCode=143 Apr 25 00:40:09.626081 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.626039 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rp79x/must-gather-wxmg7" Apr 25 00:40:09.626138 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.626083 2569 scope.go:117] "RemoveContainer" containerID="7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a" Apr 25 00:40:09.633931 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.633913 2569 scope.go:117] "RemoveContainer" containerID="1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d" Apr 25 00:40:09.645735 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.645719 2569 scope.go:117] "RemoveContainer" containerID="7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a" Apr 25 00:40:09.645959 ip-10-0-142-30 kubenswrapper[2569]: E0425 00:40:09.645942 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a\": container with ID starting with 7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a not found: ID does not exist" containerID="7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a" Apr 25 00:40:09.646020 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.645967 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a"} err="failed to get container status \"7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a\": rpc error: code = NotFound desc = could not find container \"7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a\": container with ID starting with 7fe906bf2c2f91d3558708a835c93364dab5c0a69c4a5c3d81d557caafca903a not found: ID does not exist" Apr 25 00:40:09.646020 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.645986 2569 scope.go:117] "RemoveContainer" containerID="1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d" Apr 25 00:40:09.646189 ip-10-0-142-30 
kubenswrapper[2569]: E0425 00:40:09.646175 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d\": container with ID starting with 1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d not found: ID does not exist" containerID="1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d" Apr 25 00:40:09.646231 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.646194 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d"} err="failed to get container status \"1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d\": rpc error: code = NotFound desc = could not find container \"1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d\": container with ID starting with 1d5a94b411b32add87228eb80744365f89cc3450da6694bca4decdf5bd7ff87d not found: ID does not exist" Apr 25 00:40:09.711671 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.711641 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzcpg\" (UniqueName: \"kubernetes.io/projected/5e303fd7-9b9a-4dae-b966-e7142894cbfd-kube-api-access-pzcpg\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:40:09.711671 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:09.711667 2569 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5e303fd7-9b9a-4dae-b966-e7142894cbfd-must-gather-output\") on node \"ip-10-0-142-30.ec2.internal\" DevicePath \"\"" Apr 25 00:40:11.241637 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.241604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xb82g_94089fb1-ab0c-4039-8198-ad6fb5562f6a/kube-state-metrics/0.log" Apr 25 00:40:11.265842 
ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.265814 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xb82g_94089fb1-ab0c-4039-8198-ad6fb5562f6a/kube-rbac-proxy-main/0.log" Apr 25 00:40:11.294152 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.294116 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xb82g_94089fb1-ab0c-4039-8198-ad6fb5562f6a/kube-rbac-proxy-self/0.log" Apr 25 00:40:11.360684 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.360651 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-fxrzl_a818f70c-58fa-40be-b78e-11838a11f8b9/monitoring-plugin/0.log" Apr 25 00:40:11.480639 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.480608 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ddw4g_dd8e0c2d-19dc-4d83-a697-7eca0069d620/node-exporter/0.log" Apr 25 00:40:11.504722 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.504659 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ddw4g_dd8e0c2d-19dc-4d83-a697-7eca0069d620/kube-rbac-proxy/0.log" Apr 25 00:40:11.527848 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.527829 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ddw4g_dd8e0c2d-19dc-4d83-a697-7eca0069d620/init-textfile/0.log" Apr 25 00:40:11.552560 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.552533 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" path="/var/lib/kubelet/pods/5e303fd7-9b9a-4dae-b966-e7142894cbfd/volumes" Apr 25 00:40:11.627215 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.627189 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hdcq9_ea7d009e-6127-4f50-8b69-48fd33a90311/kube-rbac-proxy-main/0.log" Apr 25 00:40:11.648602 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.648580 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hdcq9_ea7d009e-6127-4f50-8b69-48fd33a90311/kube-rbac-proxy-self/0.log" Apr 25 00:40:11.671345 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.671323 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hdcq9_ea7d009e-6127-4f50-8b69-48fd33a90311/openshift-state-metrics/0.log" Apr 25 00:40:11.900921 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.900893 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9z2ng_ac737ca1-d4f9-4109-9912-5ed5c7876fb4/prometheus-operator/0.log" Apr 25 00:40:11.922320 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.922289 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9z2ng_ac737ca1-d4f9-4109-9912-5ed5c7876fb4/kube-rbac-proxy/0.log" Apr 25 00:40:11.949660 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:11.949630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-5sb5t_4e0b59e8-7176-4cd4-90c7-cef0e7fe3b4f/prometheus-operator-admission-webhook/0.log" Apr 25 00:40:12.097757 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:12.097727 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58c9db9969-hg2cg_43258f38-cf4f-4e21-b609-e209e07de132/thanos-query/0.log" Apr 25 00:40:12.126809 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:12.126780 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-58c9db9969-hg2cg_43258f38-cf4f-4e21-b609-e209e07de132/kube-rbac-proxy-web/0.log" Apr 25 00:40:12.152396 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:12.152305 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58c9db9969-hg2cg_43258f38-cf4f-4e21-b609-e209e07de132/kube-rbac-proxy/0.log" Apr 25 00:40:12.184017 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:12.183992 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58c9db9969-hg2cg_43258f38-cf4f-4e21-b609-e209e07de132/prom-label-proxy/0.log" Apr 25 00:40:12.212145 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:12.212120 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58c9db9969-hg2cg_43258f38-cf4f-4e21-b609-e209e07de132/kube-rbac-proxy-rules/0.log" Apr 25 00:40:12.242930 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:12.242905 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-58c9db9969-hg2cg_43258f38-cf4f-4e21-b609-e209e07de132/kube-rbac-proxy-metrics/0.log" Apr 25 00:40:14.182423 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.182364 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-87zhf_ce25c16f-218e-4c97-a5d9-fc6cb1293ba6/download-server/0.log" Apr 25 00:40:14.699902 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.699869 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"] Apr 25 00:40:14.700280 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.700261 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="gather" Apr 25 00:40:14.700329 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.700284 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="gather" Apr 25 00:40:14.700329 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.700298 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="copy" Apr 25 00:40:14.700329 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.700306 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="copy" Apr 25 00:40:14.700448 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.700412 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="copy" Apr 25 00:40:14.700448 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.700428 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e303fd7-9b9a-4dae-b966-e7142894cbfd" containerName="gather" Apr 25 00:40:14.705307 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.705286 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.707320 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.707297 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sp6k\"/\"openshift-service-ca.crt\"" Apr 25 00:40:14.708742 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.708717 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sp6k\"/\"kube-root-ca.crt\"" Apr 25 00:40:14.708961 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.708946 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sp6k\"/\"default-dockercfg-r2q97\"" Apr 25 00:40:14.709775 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.709753 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"] Apr 25 00:40:14.747128 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.747099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-sys\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.747247 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.747142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dqxb\" (UniqueName: \"kubernetes.io/projected/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-kube-api-access-6dqxb\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.747247 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.747164 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-proc\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.747247 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.747180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-podres\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.747247 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.747203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-lib-modules\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.847615 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dqxb\" (UniqueName: \"kubernetes.io/projected/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-kube-api-access-6dqxb\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.847789 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-proc\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " 
pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.847789 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847642 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-podres\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.847789 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847720 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-lib-modules\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.847789 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-proc\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.847789 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847778 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-podres\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" Apr 25 00:40:14.848027 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847805 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-sys\") pod 
\"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:14.848027 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847842 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-lib-modules\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:14.848027 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.847883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-sys\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:14.854751 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:14.854727 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dqxb\" (UniqueName: \"kubernetes.io/projected/dd7a0c50-a707-4c69-88c5-5ab40f57a28d-kube-api-access-6dqxb\") pod \"perf-node-gather-daemonset-48xc4\" (UID: \"dd7a0c50-a707-4c69-88c5-5ab40f57a28d\") " pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:15.016017 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.015923 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:15.138903 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.138870 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"]
Apr 25 00:40:15.142455 ip-10-0-142-30 kubenswrapper[2569]: W0425 00:40:15.142425 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddd7a0c50_a707_4c69_88c5_5ab40f57a28d.slice/crio-dcc7422a92534ca0ecd3a0995fa0e72f7deadda8d1322c545ea862b4e8715a57 WatchSource:0}: Error finding container dcc7422a92534ca0ecd3a0995fa0e72f7deadda8d1322c545ea862b4e8715a57: Status 404 returned error can't find the container with id dcc7422a92534ca0ecd3a0995fa0e72f7deadda8d1322c545ea862b4e8715a57
Apr 25 00:40:15.221107 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.221081 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fcfzm_e80ea08a-a6a9-4898-a4aa-e2a17c1e990f/dns/0.log"
Apr 25 00:40:15.240464 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.240433 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fcfzm_e80ea08a-a6a9-4898-a4aa-e2a17c1e990f/kube-rbac-proxy/0.log"
Apr 25 00:40:15.373846 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.373819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f9b5d_d49aa07c-0862-4d3d-85c1-e60a04019252/dns-node-resolver/0.log"
Apr 25 00:40:15.643536 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.643454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" event={"ID":"dd7a0c50-a707-4c69-88c5-5ab40f57a28d","Type":"ContainerStarted","Data":"4dea6585a060104e9ed44abbd2f42ca0d8830b4b496dec8bac6fe110eaaa82ba"}
Apr 25 00:40:15.643536 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.643493 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" event={"ID":"dd7a0c50-a707-4c69-88c5-5ab40f57a28d","Type":"ContainerStarted","Data":"dcc7422a92534ca0ecd3a0995fa0e72f7deadda8d1322c545ea862b4e8715a57"}
Apr 25 00:40:15.643706 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.643581 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:15.658153 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.658101 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4" podStartSLOduration=1.658087251 podStartE2EDuration="1.658087251s" podCreationTimestamp="2026-04-25 00:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:40:15.65755782 +0000 UTC m=+2788.647961824" watchObservedRunningTime="2026-04-25 00:40:15.658087251 +0000 UTC m=+2788.648491264"
Apr 25 00:40:15.822458 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:15.822429 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2skl8_2f91e8cd-3f4c-4228-8c96-94fcc37e0124/node-ca/0.log"
Apr 25 00:40:16.903461 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:16.903432 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qm5m4_5dbdb447-15bf-4e90-abe6-da3abc588e4a/serve-healthcheck-canary/0.log"
Apr 25 00:40:17.287007 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:17.286930 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jcrnq_7a1d9e52-82af-467c-b492-7a7aa2f20acb/kube-rbac-proxy/0.log"
Apr 25 00:40:17.319966 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:17.319945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jcrnq_7a1d9e52-82af-467c-b492-7a7aa2f20acb/exporter/0.log"
Apr 25 00:40:17.340419 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:17.340382 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jcrnq_7a1d9e52-82af-467c-b492-7a7aa2f20acb/extractor/0.log"
Apr 25 00:40:21.654991 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:21.654960 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sp6k/perf-node-gather-daemonset-48xc4"
Apr 25 00:40:23.362992 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:23.362965 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-59slv_b323d79b-386c-4e79-91fe-2cdaf82ab912/migrator/0.log"
Apr 25 00:40:23.382918 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:23.382896 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-59slv_b323d79b-386c-4e79-91fe-2cdaf82ab912/graceful-termination/0.log"
Apr 25 00:40:25.144352 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.144326 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:40:25.165457 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.165430 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/egress-router-binary-copy/0.log"
Apr 25 00:40:25.184268 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.184243 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/cni-plugins/0.log"
Apr 25 00:40:25.204334 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.204308 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/bond-cni-plugin/0.log"
Apr 25 00:40:25.223744 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.223723 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/routeoverride-cni/0.log"
Apr 25 00:40:25.243455 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.243433 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/whereabouts-cni-bincopy/0.log"
Apr 25 00:40:25.262278 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.262257 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5fs9_bbc3d61a-f6ee-4df6-be84-8ee36cb3e6d8/whereabouts-cni/0.log"
Apr 25 00:40:25.288581 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.288553 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c8t2m_af315a28-630d-4b83-bf3b-2b3421fa929f/kube-multus/0.log"
Apr 25 00:40:25.466546 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.466465 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z72wj_865ada3d-5576-4faa-98c1-2b867558ffc0/network-metrics-daemon/0.log"
Apr 25 00:40:25.486917 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:25.486896 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z72wj_865ada3d-5576-4faa-98c1-2b867558ffc0/kube-rbac-proxy/0.log"
Apr 25 00:40:26.912320 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:26.912285 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-controller/0.log"
Apr 25 00:40:26.931171 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:26.931133 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/0.log"
Apr 25 00:40:26.946363 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:26.946340 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovn-acl-logging/1.log"
Apr 25 00:40:26.963531 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:26.963507 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/kube-rbac-proxy-node/0.log"
Apr 25 00:40:26.984124 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:26.984105 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:40:27.003589 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:27.003573 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/northd/0.log"
Apr 25 00:40:27.022642 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:27.022629 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/nbdb/0.log"
Apr 25 00:40:27.043155 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:27.043139 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/sbdb/0.log"
Apr 25 00:40:27.135694 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:27.135674 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v9qcn_9c43aa69-d515-46b6-8b76-dd50b64985c6/ovnkube-controller/0.log"
Apr 25 00:40:28.153346 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:28.153314 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zsqnv_2bca4834-ef71-4be6-ac75-bf2bc736877d/network-check-target-container/0.log"
Apr 25 00:40:29.073350 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:29.073321 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-b9sll_ed715705-a5a4-4ee1-b874-58c1cb13ea71/iptables-alerter/0.log"
Apr 25 00:40:29.645633 ip-10-0-142-30 kubenswrapper[2569]: I0425 00:40:29.645604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fmgbl_824d5c7b-4cdc-426b-9c01-0331c41c1293/tuned/0.log"