Apr 24 22:27:27.876888 ip-10-0-142-173 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 22:27:27.876902 ip-10-0-142-173 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 22:27:27.876912 ip-10-0-142-173 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 22:27:27.877244 ip-10-0-142-173 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 22:27:38.070830 ip-10-0-142-173 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 22:27:38.070845 ip-10-0-142-173 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot abaa03b8617346b78242e4b343ada8ce --
Apr 24 22:30:07.893905 ip-10-0-142-173 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:30:08.319320 ip-10-0-142-173 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:08.319320 ip-10-0-142-173 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:30:08.319320 ip-10-0-142-173 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:08.319320 ip-10-0-142-173 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:30:08.319320 ip-10-0-142-173 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:30:08.321635 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.321520 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:30:08.327079 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327052 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:08.327079 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327074 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:08.327079 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327078 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:08.327079 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327082 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:08.327079 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327085 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:08.327079 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327088 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327091 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327093 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327098 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
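[Editor's note: the deprecation warnings above all point to the KubeletConfiguration file named by --config. A minimal sketch of moving the flagged parameters into that file, assuming the /etc/kubernetes/kubelet.conf path that appears later in this log; the volumePluginDir path and systemReserved values are placeholders, not taken from the log:]

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (socket path from this log)
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (placeholder values)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
```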
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327102 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327105 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327108 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327111 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327113 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327115 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327118 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327120 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327123 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327125 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327127 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327130 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327133 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327135 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327138 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:08.327313 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327144 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327147 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327150 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327152 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327155 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327157 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327160 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327162 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327165 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327167 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327169 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327172 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327174 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327177 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327179 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327182 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327185 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327188 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327191 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327193 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:08.327791 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327196 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327198 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327201 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327204 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327206 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327209 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327211 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327214 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327216 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327219 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327223 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327226 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327230 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327233 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327235 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327238 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327240 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327243 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327245 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:08.328279 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327248 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327250 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327253 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327255 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327257 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327260 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327262 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327265 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327269 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327272 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327275 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327277 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327280 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327282 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327285 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327294 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327299 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327302 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327304 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327307 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:08.328760 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327310 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327313 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327315 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327762 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327768 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327771 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327773 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327776 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327778 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327781 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327783 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327786 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327789 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327792 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327794 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327797 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327800 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327803 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327805 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327808 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:08.329239 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327811 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327814 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327816 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327819 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327821 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327824 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327827 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327830 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327833 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327836 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327839 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327841 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327844 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327846 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327849 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327851 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327853 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327856 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327858 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:08.329743 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327861 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327864 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327866 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327869 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327871 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327875 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327877 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327880 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327883 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327885 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327890 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327894 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327897 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327900 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327903 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327906 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327909 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327911 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327914 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327916 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:08.330230 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327919 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327921 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327924 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327926 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327929 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327931 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327934 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327936 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327939 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327941 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327943 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327946 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327948 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327951 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327953 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327956 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327958 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327961 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327964 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327967 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:08.330745 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327969 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327972 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327974 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327977 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327981 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327984 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327987 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327989 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327992 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.327994 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328744 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328753 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328763 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328772 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328778 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328781 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328786 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328790 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328793 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328796 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328799 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:30:08.331249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328803 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328806 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328808 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328811 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328814 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328817 2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328819 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328822 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328827 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328830 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328836 2573 flags.go:64] FLAG: --config-dir=""
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328839 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328842 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328847 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328850 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328853 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328856 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328859 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328863 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328865 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328868 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328871 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328875 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328878 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328881 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:30:08.331799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328884 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328888 2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328891 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328895 2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328898 2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328901 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328904 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328908 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328912 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328915 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328917 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328920 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328923 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328926 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328929 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328932 2573 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328935 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328939 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328942 2573 flags.go:64] FLAG: --feature-gates="" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328946 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328949 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328952 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328955 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328958 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328961 2573 flags.go:64] FLAG: --help="false" Apr 24 22:30:08.332401 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328964 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328967 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328970 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328973 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328976 2573 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328979 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328982 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328984 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328987 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328995 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.328998 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329001 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329004 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329006 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329010 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329013 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329016 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329019 2573 flags.go:64] FLAG: --lock-file="" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329022 2573 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329024 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329027 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329033 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329036 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329039 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:30:08.333016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329043 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329046 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329049 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329052 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329055 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329059 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329062 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329066 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329069 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: 
I0424 22:30:08.329072 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329075 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329078 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329081 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329084 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329086 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329094 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329097 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329100 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329103 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329106 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329112 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329114 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329119 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 
22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329122 2573 flags.go:64] FLAG: --port="10250" Apr 24 22:30:08.333654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329125 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329128 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0349efd07c8f0aa51" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329130 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329133 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329136 2573 flags.go:64] FLAG: --register-node="true" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329139 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329142 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329145 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329148 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329152 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329155 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329158 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329161 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329164 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 
22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329167 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329170 2573 flags.go:64] FLAG: --runonce="false" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329173 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329176 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329178 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329181 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329184 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329186 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329189 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329192 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329195 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329198 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:30:08.334234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329200 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329204 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:30:08.334882 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:30:08.329207 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329210 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329214 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329219 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329222 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329225 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329229 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329232 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329234 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329237 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329240 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329243 2573 flags.go:64] FLAG: --v="2" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329248 2573 flags.go:64] FLAG: --version="false" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329254 2573 flags.go:64] FLAG: --vmodule="" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329258 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 
22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329261 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329359 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329363 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329366 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329370 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329373 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:30:08.334882 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329376 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329379 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329381 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329384 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329387 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329389 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329392 2573 feature_gate.go:328] 
unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329395 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329397 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329400 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329402 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329405 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329408 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329411 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329413 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329415 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329418 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329420 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329423 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:30:08.335428 
ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329425 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:30:08.335428 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329427 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329430 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329432 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329435 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329438 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329440 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329442 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329445 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329447 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329449 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329452 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329454 2573 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329456 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329459 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329461 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329464 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329466 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329468 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329470 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:30:08.335972 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329473 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329475 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329478 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329480 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329482 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 
22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329485 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329488 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329490 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329492 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329495 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329497 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329500 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329502 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329505 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329507 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329509 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329512 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329515 2573 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpoints Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329517 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329521 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:30:08.336434 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329524 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329526 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329530 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329533 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329536 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329539 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329542 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329544 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329547 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329549 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:30:08.336986 ip-10-0-142-173 
kubenswrapper[2573]: W0424 22:30:08.329552 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329554 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329556 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329559 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329562 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329564 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329567 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329569 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329572 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:08.336986 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329574 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329577 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.329580 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.329588 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.336030 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.336048 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336115 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336120 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336124 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336127 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336130 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336132 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336135 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336138 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336141 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:08.337461 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336144 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336146 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336149 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336153 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336157 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336160 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336163 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336166 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336169 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336172 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336176 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336179 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336182 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336184 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336187 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336189 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336192 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336194 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336197 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336200 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:08.337858 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336202 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336205 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336212 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336215 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336218 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336220 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336223 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336226 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336228 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336230 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336233 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336236 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336238 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336241 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336243 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336246 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336248 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336251 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336253 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336255 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:08.338391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336258 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336260 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336263 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336265 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336268 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336270 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336272 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336275 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336277 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336280 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336283 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336287 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336290 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336292 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336296 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336299 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336302 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336304 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336307 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:08.338898 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336309 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336312 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336314 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336316 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336319 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336321 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336323 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336326 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336328 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336331 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336333 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336335 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336338 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336340 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336343 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336345 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336348 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:08.339364 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336351 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.336355 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336452 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336457 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336460 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336463 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336465 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336468 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336471 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336474 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336477 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336480 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336483 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336486 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336488 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:30:08.339798 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336490 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336493 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336495 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336498 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336500 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336503 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336505 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336508 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336510 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336513 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336515 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336517 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336520 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336523 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336525 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336527 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336530 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336532 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336535 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336537 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:30:08.340168 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336540 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336542 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336544 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336547 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336549 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336551 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336554 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336556 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336559 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336561 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336564 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336566 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336569 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336571 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336573 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336576 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336578 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336580 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336583 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336585 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:30:08.340656 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336588 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336590 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336611 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336614 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336617 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336619 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336622 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336624 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336626 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336629 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336632 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336636 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336639 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336642 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336645 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336648 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336651 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336653 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336656 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:30:08.341141 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336658 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336661 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336663 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336666 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336670 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336672 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336675 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336677 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336680 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336682 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336684 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336687 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336689 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:08.336692 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.336698 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:30:08.341662 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.337372 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:30:08.342037 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.341031 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:30:08.342037 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.342033 2573 server.go:1019] "Starting client certificate rotation"
Apr 24 22:30:08.342148 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.342131 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:30:08.342181 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.342174 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:30:08.369533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.369511 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:30:08.372354 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.372333 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:30:08.387019 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.386997 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:30:08.392328 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.392305 2573 log.go:25] "Validated CRI v1 image API"
Apr 24 22:30:08.394229 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.394203 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:30:08.396243 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.396227 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:30:08.398851 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.398821 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a9203929-7c23-4f6e-9696-dcac7b0e6d8f:/dev/nvme0n1p3 ca184616-7c7b-43b8-a58c-c089d21290a5:/dev/nvme0n1p4]
Apr 24 22:30:08.398897 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.398850 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:30:08.404550 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.404424 2573 manager.go:217] Machine: {Timestamp:2026-04-24 22:30:08.402566453 +0000 UTC m=+0.393888344 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3089227 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28c337f17dd87d98e1bff744f11520 SystemUUID:ec28c337-f17d-d87d-98e1-bff744f11520 BootID:abaa03b8-6173-46b7-8242-e4b343ada8ce Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:02:ef:18:f5:bb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:02:ef:18:f5:bb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:5b:58:02:ae:d9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:30:08.404550 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.404545 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:30:08.404705 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.404691 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:30:08.405753 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.405726 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:30:08.405930 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.405755 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-173.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinR
eclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 22:30:08.405977 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.405940 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 22:30:08.405977 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.405949 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 22:30:08.405977 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.405962 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:30:08.406803 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.406793 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 22:30:08.407615 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.407590 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:30:08.407787 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.407778 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 22:30:08.410107 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.410096 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 24 22:30:08.410665 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.410654 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 22:30:08.410705 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.410676 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 
22:30:08.410705 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.410685 2573 kubelet.go:397] "Adding apiserver pod source" Apr 24 22:30:08.410705 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.410693 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 22:30:08.411782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.411764 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:30:08.411782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.411784 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:30:08.416333 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.416316 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:30:08.417719 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.417702 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:30:08.419317 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419296 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:30:08.419317 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419314 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:30:08.419317 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419321 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419326 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419332 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: 
I0424 22:30:08.419338 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419344 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419350 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419357 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419363 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419380 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:30:08.419456 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.419389 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:30:08.420188 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.420178 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:30:08.420188 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.420188 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:30:08.424363 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.424349 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:30:08.424436 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.424389 2573 server.go:1295] "Started kubelet" Apr 24 22:30:08.424499 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.424462 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 22:30:08.424578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.424543 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 
22:30:08.424640 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.424629 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:30:08.428258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.428236 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hn4g4" Apr 24 22:30:08.428450 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.428431 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 22:30:08.428711 ip-10-0-142-173 systemd[1]: Started Kubernetes Kubelet. Apr 24 22:30:08.429844 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.429831 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-173.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 22:30:08.430089 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.430050 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 22:30:08.430192 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.430092 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-173.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 22:30:08.431187 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.431166 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 24 22:30:08.433191 ip-10-0-142-173 kubenswrapper[2573]: E0424 
22:30:08.432189 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-173.ec2.internal.18a96b9350349ff2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-173.ec2.internal,UID:ip-10-0-142-173.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-173.ec2.internal,},FirstTimestamp:2026-04-24 22:30:08.42436197 +0000 UTC m=+0.415683864,LastTimestamp:2026-04-24 22:30:08.42436197 +0000 UTC m=+0.415683864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-173.ec2.internal,}" Apr 24 22:30:08.434369 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.434349 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hn4g4" Apr 24 22:30:08.434691 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.434643 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 22:30:08.434691 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.434651 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.435562 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.435835 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.436777 2573 volume_manager.go:295] "The desired_state_of_world populator 
starts" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.436795 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.436955 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.436964 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.437023 2573 factory.go:55] Registering systemd factory Apr 24 22:30:08.438223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.437033 2573 factory.go:223] Registration of the systemd container factory successfully Apr 24 22:30:08.438632 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.438320 2573 factory.go:153] Registering CRI-O factory Apr 24 22:30:08.438632 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.438336 2573 factory.go:223] Registration of the crio container factory successfully Apr 24 22:30:08.438632 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.438391 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 22:30:08.438632 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.438417 2573 factory.go:103] Registering Raw factory Apr 24 22:30:08.438632 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.438432 2573 manager.go:1196] Started watching for new ooms in manager Apr 24 22:30:08.438854 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.438834 2573 manager.go:319] Starting recovery of all containers Apr 24 22:30:08.439800 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.439775 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 22:30:08.446697 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.446674 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:08.448177 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.448123 2573 manager.go:324] Recovery completed Apr 24 22:30:08.451843 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.451830 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:08.452297 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.452281 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-173.ec2.internal\" not found" node="ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.454193 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.454180 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:08.454245 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.454206 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:08.454245 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.454216 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:08.454670 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.454659 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 22:30:08.454710 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.454671 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 22:30:08.454710 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.454697 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 22:30:08.456759 
ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.456747 2573 policy_none.go:49] "None policy: Start" Apr 24 22:30:08.456804 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.456775 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 22:30:08.456804 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.456785 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501170 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.501260 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501272 2573 server.go:85] "Starting device plugin registration server" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501493 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501503 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501611 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501701 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.501715 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.502397 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 22:30:08.506578 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.502431 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:08.561170 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.561126 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 22:30:08.562479 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.562449 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 22:30:08.562479 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.562478 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 22:30:08.562690 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.562497 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 22:30:08.562690 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.562505 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 22:30:08.562690 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.562536 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 22:30:08.566312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.566288 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:08.602230 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.602136 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:08.603658 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.603638 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:08.603762 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.603672 2573 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:08.603762 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.603685 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:08.603762 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.603709 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.612471 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.612450 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.612520 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.612475 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-173.ec2.internal\": node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:08.639213 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.639179 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:08.663388 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.663359 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal"] Apr 24 22:30:08.663446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.663432 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:08.664505 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.664488 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:08.664588 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.664516 2573 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:08.664588 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.664527 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:08.665727 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.665715 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:08.665858 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.665843 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.665891 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.665873 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:08.666526 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.666507 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:08.666526 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.666517 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:08.666678 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.666537 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:08.666678 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.666539 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:08.666678 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.666547 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" 
event="NodeHasSufficientPID" Apr 24 22:30:08.666678 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.666553 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:08.667705 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.667689 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.667777 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.667715 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 22:30:08.668364 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.668350 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientMemory" Apr 24 22:30:08.668431 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.668375 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 22:30:08.668431 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.668387 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeHasSufficientPID" Apr 24 22:30:08.699395 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.699372 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-173.ec2.internal\" not found" node="ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.703787 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.703770 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-173.ec2.internal\" not found" node="ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.739587 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.739556 2573 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:08.739721 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.739650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/840a721be2697b1c2b72dd11b149c26f-config\") pod \"kube-apiserver-proxy-ip-10-0-142-173.ec2.internal\" (UID: \"840a721be2697b1c2b72dd11b149c26f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.739721 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.739676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a0fd0b02318dafd43a8edecf3cbc9ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal\" (UID: \"5a0fd0b02318dafd43a8edecf3cbc9ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.739721 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.739697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a0fd0b02318dafd43a8edecf3cbc9ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal\" (UID: \"5a0fd0b02318dafd43a8edecf3cbc9ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.840095 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.840057 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:08.840224 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.840121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a0fd0b02318dafd43a8edecf3cbc9ca-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal\" (UID: \"5a0fd0b02318dafd43a8edecf3cbc9ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.840224 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.840158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a0fd0b02318dafd43a8edecf3cbc9ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal\" (UID: \"5a0fd0b02318dafd43a8edecf3cbc9ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.840224 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.840174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/840a721be2697b1c2b72dd11b149c26f-config\") pod \"kube-apiserver-proxy-ip-10-0-142-173.ec2.internal\" (UID: \"840a721be2697b1c2b72dd11b149c26f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.840315 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.840231 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a0fd0b02318dafd43a8edecf3cbc9ca-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal\" (UID: \"5a0fd0b02318dafd43a8edecf3cbc9ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.840315 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.840245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a0fd0b02318dafd43a8edecf3cbc9ca-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal\" (UID: \"5a0fd0b02318dafd43a8edecf3cbc9ca\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 
22:30:08.840315 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:08.840269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/840a721be2697b1c2b72dd11b149c26f-config\") pod \"kube-apiserver-proxy-ip-10-0-142-173.ec2.internal\" (UID: \"840a721be2697b1c2b72dd11b149c26f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" Apr 24 22:30:08.940494 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:08.940407 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.000617 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.000567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:09.006228 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.006204 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" Apr 24 22:30:09.040910 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.040873 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.141259 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.141220 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.241681 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.241650 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.342086 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.342042 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 22:30:09.342086 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.342064 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.342871 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.342208 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 22:30:09.342871 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.342209 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 22:30:09.435655 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.435628 2573 certificate_manager.go:566] "Rotating 
certificates" logger="kubernetes.io/kubelet-serving" Apr 24 22:30:09.437730 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.437700 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:25:08 +0000 UTC" deadline="2028-01-27 00:15:59.430246964 +0000 UTC" Apr 24 22:30:09.437730 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.437727 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15409h45m49.992522787s" Apr 24 22:30:09.443081 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.443059 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.457340 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.457313 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 22:30:09.493810 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.493742 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6m8xm" Apr 24 22:30:09.509206 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.509179 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6m8xm" Apr 24 22:30:09.534179 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:09.534132 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0fd0b02318dafd43a8edecf3cbc9ca.slice/crio-8356ad8c35569535786de5f830cdcc55d3cbd6cf9c78dbe2511eac1deef60b72 WatchSource:0}: Error finding container 8356ad8c35569535786de5f830cdcc55d3cbd6cf9c78dbe2511eac1deef60b72: Status 404 returned error can't find the container with id 
8356ad8c35569535786de5f830cdcc55d3cbd6cf9c78dbe2511eac1deef60b72 Apr 24 22:30:09.534426 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:09.534406 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840a721be2697b1c2b72dd11b149c26f.slice/crio-9ddfa6dc29691f8a06154203dbee424a9cdb90120f497029a4a3d72cd0487cd4 WatchSource:0}: Error finding container 9ddfa6dc29691f8a06154203dbee424a9cdb90120f497029a4a3d72cd0487cd4: Status 404 returned error can't find the container with id 9ddfa6dc29691f8a06154203dbee424a9cdb90120f497029a4a3d72cd0487cd4 Apr 24 22:30:09.539306 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.539291 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:30:09.544037 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.544018 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.565820 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.565765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" event={"ID":"5a0fd0b02318dafd43a8edecf3cbc9ca","Type":"ContainerStarted","Data":"8356ad8c35569535786de5f830cdcc55d3cbd6cf9c78dbe2511eac1deef60b72"} Apr 24 22:30:09.566743 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.566705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" event={"ID":"840a721be2697b1c2b72dd11b149c26f","Type":"ContainerStarted","Data":"9ddfa6dc29691f8a06154203dbee424a9cdb90120f497029a4a3d72cd0487cd4"} Apr 24 22:30:09.645158 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.645126 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.745644 ip-10-0-142-173 
kubenswrapper[2573]: E0424 22:30:09.745561 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.784049 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:09.784021 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:09.846148 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.846107 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:09.946921 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:09.946886 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-173.ec2.internal\" not found" Apr 24 22:30:10.010575 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.010493 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:10.035994 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.035948 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" Apr 24 22:30:10.048446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.048323 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:30:10.049491 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.049257 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" Apr 24 22:30:10.071263 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.071229 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 22:30:10.325848 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:30:10.325767 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:10.412235 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.412200 2573 apiserver.go:52] "Watching apiserver" Apr 24 22:30:10.420484 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.420456 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 22:30:10.423041 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.422961 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-tdfbz","openshift-dns/node-resolver-b7xgj","openshift-network-diagnostics/network-check-target-hqprm","openshift-network-operator/iptables-alerter-fk6tq","openshift-ovn-kubernetes/ovnkube-node-ckgcl","kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd","openshift-cluster-node-tuning-operator/tuned-brsxm","openshift-image-registry/node-ca-9lblg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal","openshift-multus/multus-7v756","openshift-multus/multus-additional-cni-plugins-mq5d5","openshift-multus/network-metrics-daemon-kdqw9"] Apr 24 22:30:10.424850 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.424824 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.425933 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.425914 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b7xgj" Apr 24 22:30:10.426980 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.426962 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:10.427085 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.427028 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:10.428009 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.427988 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.429863 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.429846 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:10.430108 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.430049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.431071 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.431052 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9lblg" Apr 24 22:30:10.431167 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.431130 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.431646 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.431626 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:30:10.432034 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.431845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.432144 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432127 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.432245 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432228 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.432902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432671 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.432902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432696 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.432902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432711 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:30:10.432902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432716 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-f49vr\"" Apr 24 22:30:10.432902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gvh2n\"" Apr 24 22:30:10.432902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.432712 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5r9c4\"" Apr 24 22:30:10.434121 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.433753 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7v756" Apr 24 22:30:10.434121 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.433871 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.434956 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.434934 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:10.435037 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.435010 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:10.436130 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.436115 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.436795 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.436779 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:30:10.436878 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.436865 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:30:10.436924 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.436911 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.437181 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.437170 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:30:10.437228 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.437183 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.437228 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.437198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.437808 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.437787 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.439162 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.439141 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-crzrv\"" Apr 24 22:30:10.439361 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.439337 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:30:10.439560 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.439544 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:30:10.439766 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.439745 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:30:10.439842 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.439808 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jbwlx\"" Apr 24 22:30:10.440083 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440064 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.440163 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440100 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ztw56\"" Apr 24 22:30:10.440163 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440154 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8747p\"" Apr 24 22:30:10.440372 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440105 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:30:10.440566 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440538 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 22:30:10.440864 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440841 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wwb2r\"" Apr 24 22:30:10.440944 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.440865 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9nvt5\"" Apr 24 22:30:10.441137 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.441121 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.443501 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.443481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:30:10.448753 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448736 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:30:10.448843 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-etc-kubernetes\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.448843 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zj7p\" (UniqueName: \"kubernetes.io/projected/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-kube-api-access-6zj7p\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.448843 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-run\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449007 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-systemd\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449007 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-netns\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.449007 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-conf-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.449007 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:10.449007 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.448990 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-systemd-units\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-etc-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-socket-dir-parent\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-systemd\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qns8g\" (UniqueName: \"kubernetes.io/projected/b114ecc3-3191-4768-a2bc-d878a4044ee3-kube-api-access-qns8g\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " 
pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449122 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-run-ovn-kubernetes\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-cni-bin\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-env-overrides\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449212 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-os-release\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449236 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-sys-fs\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysctl-d\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-node-log\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-cni-netd\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovn-node-metrics-cert\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449376 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5665ddf8-e176-403e-874a-d3f3d5a59d2e-konnectivity-ca\") pod \"konnectivity-agent-tdfbz\" (UID: \"5665ddf8-e176-403e-874a-d3f3d5a59d2e\") " pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449432 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.449446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-daemon-config\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449483 2573 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtgq\" (UniqueName: \"kubernetes.io/projected/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-kube-api-access-9wtgq\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449467 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449514 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-iptables-alerter-script\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-log-socket\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-multus-certs\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " 
pod="openshift-multus/multus-7v756" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysconfig\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-lib-modules\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449706 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-host\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449739 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-tuned\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449770 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-kubelet\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.449857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-cni-binary-copy\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-k8s-cni-cncf-io\") pod \"multus-7v756\" (UID: 
\"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-cni-bin\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cnibin\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449974 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjf2p\" (UniqueName: \"kubernetes.io/projected/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-kube-api-access-kjf2p\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.449994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-hosts-file\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c2x9d\" (UniqueName: \"kubernetes.io/projected/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-kube-api-access-c2x9d\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovnkube-config\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a9aa9ab-1597-4fc6-8210-93b838855a27-host\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-system-cni-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-cni-multus\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450151 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-kubelet\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450229 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-device-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-tmp-dir\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj" Apr 24 22:30:10.450389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-kubernetes\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ch8z\" (UniqueName: \"kubernetes.io/projected/1a9aa9ab-1597-4fc6-8210-93b838855a27-kube-api-access-2ch8z\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450335 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-hostroot\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-569jl\" (UniqueName: \"kubernetes.io/projected/774ca196-c906-4414-b4f1-a64f30625a6e-kube-api-access-569jl\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450402 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-host-slash\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysctl-conf\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-var-lib-kubelet\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-modprobe-d\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-ovn\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:30:10.450567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovnkube-script-lib\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g687p\" (UniqueName: \"kubernetes.io/projected/f179e6c5-8a33-48d8-96ce-1400a4dcde57-kube-api-access-g687p\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a9aa9ab-1597-4fc6-8210-93b838855a27-serviceca\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450673 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-tmp\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-run-netns\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-cnibin\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.451080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-os-release\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-socket-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-registration-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-var-lib-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450918 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5665ddf8-e176-403e-874a-d3f3d5a59d2e-agent-certs\") pod \"konnectivity-agent-tdfbz\" (UID: \"5665ddf8-e176-403e-874a-d3f3d5a59d2e\") " pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.450941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-cni-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:30:10.451009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-system-cni-dir\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.451033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-sys\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.451058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-slash\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.451081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.451682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.451102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p2gb\" (UniqueName: \"kubernetes.io/projected/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-kube-api-access-2p2gb\") pod \"tuned-brsxm\" (UID: 
\"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.510554 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.510487 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:25:09 +0000 UTC" deadline="2027-12-09 05:30:13.915489806 +0000 UTC" Apr 24 22:30:10.510554 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.510549 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14239h0m3.404945064s" Apr 24 22:30:10.552156 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-etc-kubernetes\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552156 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zj7p\" (UniqueName: \"kubernetes.io/projected/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-kube-api-access-6zj7p\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-run\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-systemd\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-etc-kubernetes\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-systemd\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-run\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-netns\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-conf-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552344 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-netns\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-systemd-units\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-etc-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-socket-dir-parent\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-systemd\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qns8g\" (UniqueName: \"kubernetes.io/projected/b114ecc3-3191-4768-a2bc-d878a4044ee3-kube-api-access-qns8g\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-conf-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-run-ovn-kubernetes\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.552499 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552506 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-run-ovn-kubernetes\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-cni-bin\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-systemd-units\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-env-overrides\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.552616 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:11.052557183 +0000 UTC m=+3.043879062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-socket-dir-parent\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-systemd\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-etc-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-os-release\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.552978 
ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-sys-fs\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysctl-d\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.552978 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-node-log\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-cni-netd\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.552960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovn-node-metrics-cert\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-cni-bin\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-sys-fs\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553082 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-os-release\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-node-log\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5665ddf8-e176-403e-874a-d3f3d5a59d2e-konnectivity-ca\") pod \"konnectivity-agent-tdfbz\" (UID: \"5665ddf8-e176-403e-874a-d3f3d5a59d2e\") " pod="kube-system/konnectivity-agent-tdfbz"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-env-overrides\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-daemon-config\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wtgq\" (UniqueName: \"kubernetes.io/projected/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-kube-api-access-9wtgq\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-iptables-alerter-script\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553302 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-log-socket\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-multus-certs\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysconfig\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.553770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-lib-modules\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-host\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553448 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-multus-certs\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-tuned\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-kubelet\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-cni-binary-copy\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553571 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-k8s-cni-cncf-io\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-cni-bin\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cnibin\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-log-socket\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-kubelet\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-cni-netd\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjf2p\" (UniqueName: \"kubernetes.io/projected/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-kube-api-access-kjf2p\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-hosts-file\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2x9d\" (UniqueName: \"kubernetes.io/projected/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-kube-api-access-c2x9d\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovnkube-config\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553958 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysctl-d\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.553962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a9aa9ab-1597-4fc6-8210-93b838855a27-host\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-system-cni-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554034 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-cni-multus\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554065 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-kubelet\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-device-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-tmp-dir\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-kubernetes\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ch8z\" (UniqueName: \"kubernetes.io/projected/1a9aa9ab-1597-4fc6-8210-93b838855a27-kube-api-access-2ch8z\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-cni-binary-copy\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-hostroot\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-run-k8s-cni-cncf-io\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.554940 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-cni-bin\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-569jl\" (UniqueName: \"kubernetes.io/projected/774ca196-c906-4414-b4f1-a64f30625a6e-kube-api-access-569jl\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-host-slash\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysctl-conf\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-var-lib-kubelet\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-modprobe-d\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-ovn\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5665ddf8-e176-403e-874a-d3f3d5a59d2e-konnectivity-ca\") pod \"konnectivity-agent-tdfbz\" (UID: \"5665ddf8-e176-403e-874a-d3f3d5a59d2e\") " pod="kube-system/konnectivity-agent-tdfbz"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovnkube-script-lib\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-device-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g687p\" (UniqueName: \"kubernetes.io/projected/f179e6c5-8a33-48d8-96ce-1400a4dcde57-kube-api-access-g687p\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a9aa9ab-1597-4fc6-8210-93b838855a27-serviceca\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-tmp\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-run-netns\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-daemon-config\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-cnibin\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554715 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysconfig\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.556709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.554802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-hosts-file\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-hostroot\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-tmp-dir\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-cni-multus\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a9aa9ab-1597-4fc6-8210-93b838855a27-host\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-run-ovn\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555566 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-host\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-modprobe-d\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-run-netns\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-system-cni-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-host-var-lib-kubelet\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555758 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-cnibin\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-kubernetes\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-sysctl-conf\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-host-slash\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.555972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-lib-modules\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.557534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-os-release\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-socket-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-var-lib-kubelet\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-registration-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556074 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-os-release\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556148 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1a9aa9ab-1597-4fc6-8210-93b838855a27-serviceca\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-registration-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-etc-selinux\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd"
Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName:
\"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovnkube-script-lib\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-var-lib-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/774ca196-c906-4414-b4f1-a64f30625a6e-socket-dir\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5665ddf8-e176-403e-874a-d3f3d5a59d2e-agent-certs\") pod \"konnectivity-agent-tdfbz\" (UID: \"5665ddf8-e176-403e-874a-d3f3d5a59d2e\") " pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556291 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-cni-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-system-cni-dir\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556341 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-sys\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.558362 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-slash\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556412 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p2gb\" (UniqueName: \"kubernetes.io/projected/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-kube-api-access-2p2gb\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-var-lib-openvswitch\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovnkube-config\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cnibin\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.559099 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:30:10.556708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-multus-cni-dir\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f179e6c5-8a33-48d8-96ce-1400a4dcde57-host-slash\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-sys\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-system-cni-dir\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.556820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f179e6c5-8a33-48d8-96ce-1400a4dcde57-ovn-node-metrics-cert\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:30:10.557183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.557946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-etc-tuned\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.558048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-iptables-alerter-script\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.559099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.558379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-tmp\") pod \"tuned-brsxm\" (UID: \"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.559641 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.559133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5665ddf8-e176-403e-874a-d3f3d5a59d2e-agent-certs\") pod \"konnectivity-agent-tdfbz\" (UID: \"5665ddf8-e176-403e-874a-d3f3d5a59d2e\") " pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:10.565784 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:30:10.565755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ch8z\" (UniqueName: \"kubernetes.io/projected/1a9aa9ab-1597-4fc6-8210-93b838855a27-kube-api-access-2ch8z\") pod \"node-ca-9lblg\" (UID: \"1a9aa9ab-1597-4fc6-8210-93b838855a27\") " pod="openshift-image-registry/node-ca-9lblg" Apr 24 22:30:10.566337 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.566317 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:10.566379 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.566349 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:10.566379 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.566372 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:10.566452 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:10.566441 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.06641373 +0000 UTC m=+3.057735628 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:10.570051 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.570024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjf2p\" (UniqueName: \"kubernetes.io/projected/5b1b5f17-6df7-4280-b50e-f0241d9ab7d6-kube-api-access-kjf2p\") pod \"multus-additional-cni-plugins-mq5d5\" (UID: \"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6\") " pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:10.572534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.571928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-569jl\" (UniqueName: \"kubernetes.io/projected/774ca196-c906-4414-b4f1-a64f30625a6e-kube-api-access-569jl\") pod \"aws-ebs-csi-driver-node-sptcd\" (UID: \"774ca196-c906-4414-b4f1-a64f30625a6e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.572534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.572076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wtgq\" (UniqueName: \"kubernetes.io/projected/adea4cfb-400a-43d8-8b2d-0cd0d88160f5-kube-api-access-9wtgq\") pod \"node-resolver-b7xgj\" (UID: \"adea4cfb-400a-43d8-8b2d-0cd0d88160f5\") " pod="openshift-dns/node-resolver-b7xgj" Apr 24 22:30:10.572709 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.572574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p2gb\" (UniqueName: \"kubernetes.io/projected/a75bb813-f2e4-4f8e-a0e9-677e2345d5f2-kube-api-access-2p2gb\") pod \"tuned-brsxm\" (UID: 
\"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2\") " pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.574228 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.574205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g687p\" (UniqueName: \"kubernetes.io/projected/f179e6c5-8a33-48d8-96ce-1400a4dcde57-kube-api-access-g687p\") pod \"ovnkube-node-ckgcl\" (UID: \"f179e6c5-8a33-48d8-96ce-1400a4dcde57\") " pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.574312 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.574207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zj7p\" (UniqueName: \"kubernetes.io/projected/d413d1a6-f8ca-40a5-90ec-78dff39daaf1-kube-api-access-6zj7p\") pod \"multus-7v756\" (UID: \"d413d1a6-f8ca-40a5-90ec-78dff39daaf1\") " pod="openshift-multus/multus-7v756" Apr 24 22:30:10.574711 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.574693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2x9d\" (UniqueName: \"kubernetes.io/projected/f09ea6fe-aff1-4e92-a7ca-c70f50d186ec-kube-api-access-c2x9d\") pod \"iptables-alerter-fk6tq\" (UID: \"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec\") " pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.575693 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.575654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qns8g\" (UniqueName: \"kubernetes.io/projected/b114ecc3-3191-4768-a2bc-d878a4044ee3-kube-api-access-qns8g\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:10.645373 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.645268 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:30:10.736413 ip-10-0-142-173 kubenswrapper[2573]: 
I0424 22:30:10.736381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" Apr 24 22:30:10.745221 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.745196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b7xgj" Apr 24 22:30:10.757548 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.757523 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:10.762053 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.762031 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fk6tq" Apr 24 22:30:10.768517 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.768497 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:10.775140 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.775121 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brsxm" Apr 24 22:30:10.782684 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.782663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9lblg" Apr 24 22:30:10.788247 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.788226 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7v756" Apr 24 22:30:10.792752 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:10.792733 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" Apr 24 22:30:11.059690 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.059657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:11.059873 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.059832 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:11.059942 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.059931 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.059910119 +0000 UTC m=+4.051232009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:11.150892 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.150721 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5665ddf8_e176_403e_874a_d3f3d5a59d2e.slice/crio-02e68f4887dd5b6a2121874cb4476be94f5d1f34503c55d4e365c989f6ed4ba0 WatchSource:0}: Error finding container 02e68f4887dd5b6a2121874cb4476be94f5d1f34503c55d4e365c989f6ed4ba0: Status 404 returned error can't find the container with id 02e68f4887dd5b6a2121874cb4476be94f5d1f34503c55d4e365c989f6ed4ba0 Apr 24 22:30:11.152290 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.152266 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9aa9ab_1597_4fc6_8210_93b838855a27.slice/crio-c179bc751af77cc9a427987e94d50f74a25b94a7fb7250aabb58b7cde214a495 WatchSource:0}: Error finding container c179bc751af77cc9a427987e94d50f74a25b94a7fb7250aabb58b7cde214a495: Status 404 returned error can't find the container with id c179bc751af77cc9a427987e94d50f74a25b94a7fb7250aabb58b7cde214a495 Apr 24 22:30:11.155799 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.155779 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf179e6c5_8a33_48d8_96ce_1400a4dcde57.slice/crio-7f41efd02c33cf5b14cfb7ff8a7cb26a9b457582a2b9e562adc1719999202911 WatchSource:0}: Error finding container 7f41efd02c33cf5b14cfb7ff8a7cb26a9b457582a2b9e562adc1719999202911: Status 404 returned error can't find the container with id 7f41efd02c33cf5b14cfb7ff8a7cb26a9b457582a2b9e562adc1719999202911 Apr 24 22:30:11.156421 
ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.156391 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75bb813_f2e4_4f8e_a0e9_677e2345d5f2.slice/crio-a0410f79bf8a20af1f386f0404d67dfd240d8f364c7dea0156c79f323a6b4b18 WatchSource:0}: Error finding container a0410f79bf8a20af1f386f0404d67dfd240d8f364c7dea0156c79f323a6b4b18: Status 404 returned error can't find the container with id a0410f79bf8a20af1f386f0404d67dfd240d8f364c7dea0156c79f323a6b4b18 Apr 24 22:30:11.157804 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.157714 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774ca196_c906_4414_b4f1_a64f30625a6e.slice/crio-4a9676708e2bef48b8ce7074b902f9d92a60c29110f695a7a6fb03f1cde0d19e WatchSource:0}: Error finding container 4a9676708e2bef48b8ce7074b902f9d92a60c29110f695a7a6fb03f1cde0d19e: Status 404 returned error can't find the container with id 4a9676708e2bef48b8ce7074b902f9d92a60c29110f695a7a6fb03f1cde0d19e Apr 24 22:30:11.158729 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.158704 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf09ea6fe_aff1_4e92_a7ca_c70f50d186ec.slice/crio-5a925f91a991a70dc15d669df5c34193d6b9f7439e010bd69d232477c7741450 WatchSource:0}: Error finding container 5a925f91a991a70dc15d669df5c34193d6b9f7439e010bd69d232477c7741450: Status 404 returned error can't find the container with id 5a925f91a991a70dc15d669df5c34193d6b9f7439e010bd69d232477c7741450 Apr 24 22:30:11.159727 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.159701 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd413d1a6_f8ca_40a5_90ec_78dff39daaf1.slice/crio-52306b47cd984d35621abd76da51a1fcd4f0b8ea49be794f5e01fd2827793b85 WatchSource:0}: 
Error finding container 52306b47cd984d35621abd76da51a1fcd4f0b8ea49be794f5e01fd2827793b85: Status 404 returned error can't find the container with id 52306b47cd984d35621abd76da51a1fcd4f0b8ea49be794f5e01fd2827793b85 Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.160063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.160217 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.160237 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.160250 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.160330 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.160284728 +0000 UTC m=+4.151606629 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.160689 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b1b5f17_6df7_4280_b50e_f0241d9ab7d6.slice/crio-385aa4ad3100d163da7f071923ae3a03db63890f9e00a406cedfc9ad8099c06c WatchSource:0}: Error finding container 385aa4ad3100d163da7f071923ae3a03db63890f9e00a406cedfc9ad8099c06c: Status 404 returned error can't find the container with id 385aa4ad3100d163da7f071923ae3a03db63890f9e00a406cedfc9ad8099c06c
Apr 24 22:30:11.161572 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:11.161123 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadea4cfb_400a_43d8_8b2d_0cd0d88160f5.slice/crio-090bbe1a9378cb7d271b746d3c156bcecb6b3463e349d8cb383787eabe9e8750 WatchSource:0}: Error finding container 090bbe1a9378cb7d271b746d3c156bcecb6b3463e349d8cb383787eabe9e8750: Status 404 returned error can't find the container with id 090bbe1a9378cb7d271b746d3c156bcecb6b3463e349d8cb383787eabe9e8750
Apr 24 22:30:11.511337 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.510857 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:25:09 +0000 UTC" deadline="2027-10-01 01:12:07.365242225 +0000 UTC"
Apr 24 22:30:11.511337 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.510919 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12578h41m55.854327524s"
Apr 24 22:30:11.563798 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.563764 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:11.563982 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:11.563919 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:11.585588 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.585517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b7xgj" event={"ID":"adea4cfb-400a-43d8-8b2d-0cd0d88160f5","Type":"ContainerStarted","Data":"090bbe1a9378cb7d271b746d3c156bcecb6b3463e349d8cb383787eabe9e8750"}
Apr 24 22:30:11.592111 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.592047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7v756" event={"ID":"d413d1a6-f8ca-40a5-90ec-78dff39daaf1","Type":"ContainerStarted","Data":"52306b47cd984d35621abd76da51a1fcd4f0b8ea49be794f5e01fd2827793b85"}
Apr 24 22:30:11.597134 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.597073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" event={"ID":"774ca196-c906-4414-b4f1-a64f30625a6e","Type":"ContainerStarted","Data":"4a9676708e2bef48b8ce7074b902f9d92a60c29110f695a7a6fb03f1cde0d19e"}
Apr 24 22:30:11.606037 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.605975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brsxm" event={"ID":"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2","Type":"ContainerStarted","Data":"a0410f79bf8a20af1f386f0404d67dfd240d8f364c7dea0156c79f323a6b4b18"}
Apr 24 22:30:11.618853 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.618809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9lblg" event={"ID":"1a9aa9ab-1597-4fc6-8210-93b838855a27","Type":"ContainerStarted","Data":"c179bc751af77cc9a427987e94d50f74a25b94a7fb7250aabb58b7cde214a495"}
Apr 24 22:30:11.620392 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.620343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tdfbz" event={"ID":"5665ddf8-e176-403e-874a-d3f3d5a59d2e","Type":"ContainerStarted","Data":"02e68f4887dd5b6a2121874cb4476be94f5d1f34503c55d4e365c989f6ed4ba0"}
Apr 24 22:30:11.628032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.627215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" event={"ID":"840a721be2697b1c2b72dd11b149c26f","Type":"ContainerStarted","Data":"2625a10cbdd3c9282485c39cd664768c412a66bf9e36410333a6b12ade9499b6"}
Apr 24 22:30:11.639388 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.639343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerStarted","Data":"385aa4ad3100d163da7f071923ae3a03db63890f9e00a406cedfc9ad8099c06c"}
Apr 24 22:30:11.646385 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.646346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fk6tq" event={"ID":"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec","Type":"ContainerStarted","Data":"5a925f91a991a70dc15d669df5c34193d6b9f7439e010bd69d232477c7741450"}
Apr 24 22:30:11.647475 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.647428 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-173.ec2.internal" podStartSLOduration=1.647412982 podStartE2EDuration="1.647412982s" podCreationTimestamp="2026-04-24 22:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:11.647302346 +0000 UTC m=+3.638624248" watchObservedRunningTime="2026-04-24 22:30:11.647412982 +0000 UTC m=+3.638734884"
Apr 24 22:30:11.649100 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:11.649073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"7f41efd02c33cf5b14cfb7ff8a7cb26a9b457582a2b9e562adc1719999202911"}
Apr 24 22:30:12.068924 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:12.068886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:12.069089 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.069064 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:12.069168 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.069128 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.069109762 +0000 UTC m=+6.060431655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:12.169799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:12.169722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:12.169955 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.169929 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:12.170068 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.170010 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:12.170068 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.170047 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:12.170185 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.170126 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.17010681 +0000 UTC m=+6.161428694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:12.566015 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:12.565977 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:12.566425 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:12.566143 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:12.672385 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:12.671305 2573 generic.go:358] "Generic (PLEG): container finished" podID="5a0fd0b02318dafd43a8edecf3cbc9ca" containerID="fbddbcc6a7f07b0730f2a641b67ef63d453fd0d7b5953b267920fbb9749f6035" exitCode=0
Apr 24 22:30:12.672385 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:12.671939 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" event={"ID":"5a0fd0b02318dafd43a8edecf3cbc9ca","Type":"ContainerDied","Data":"fbddbcc6a7f07b0730f2a641b67ef63d453fd0d7b5953b267920fbb9749f6035"}
Apr 24 22:30:13.563825 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:13.563791 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:13.564012 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:13.563941 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:13.677946 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:13.677909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" event={"ID":"5a0fd0b02318dafd43a8edecf3cbc9ca","Type":"ContainerStarted","Data":"b96345b19e7e52a2a828a90fefbcc5ee37b0f9bb11d61c3888b55d3d158e312d"}
Apr 24 22:30:14.090100 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:14.089489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:14.090100 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.089645 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:14.090100 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.089707 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.089690028 +0000 UTC m=+10.081011910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:14.190520 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:14.189859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:14.190520 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.190038 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:14.190520 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.190069 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:14.190520 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.190084 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:14.190520 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.190142 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.190124645 +0000 UTC m=+10.181446529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:14.562922 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:14.562891 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:14.563087 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:14.563047 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:15.563424 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:15.563379 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:15.563864 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:15.563533 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:16.565218 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:16.565180 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:16.565696 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:16.565322 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:17.563279 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:17.563244 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:17.563473 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:17.563381 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:18.124418 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:18.123841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:18.124418 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.124018 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:18.124418 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.124081 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.124062048 +0000 UTC m=+18.115383934 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:18.225102 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:18.225059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:18.225297 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.225264 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:18.225297 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.225283 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:18.225297 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.225292 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:18.225461 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.225339 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.225325228 +0000 UTC m=+18.216647105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:18.564090 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:18.564055 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:18.564273 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:18.564168 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:19.563628 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:19.563247 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:19.563628 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:19.563415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:20.562857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:20.562826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:20.563037 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:20.562951 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:21.563170 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:21.563135 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:21.563636 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:21.563276 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:22.563542 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:22.563498 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:22.563991 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:22.563647 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:23.562911 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:23.562876 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:23.563105 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:23.562996 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:24.563482 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:24.563443 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:24.563903 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:24.563579 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:25.563369 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:25.563320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:25.563529 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:25.563469 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:26.179735 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:26.179694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:26.179903 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.179858 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:26.179948 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.179929 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.179914953 +0000 UTC m=+34.171236832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:30:26.280046 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:26.280005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:26.280216 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.280189 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:30:26.280290 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.280217 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:30:26.280290 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.280231 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:26.280390 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.280297 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.280279187 +0000 UTC m=+34.271601069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:30:26.563782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:26.563751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:26.564219 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:26.563847 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:27.562974 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:27.562937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:27.563143 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:27.563069 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:28.563972 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:28.563940 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:28.564354 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:28.564025 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:29.563726 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.563304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:29.563916 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:29.563835 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:29.709373 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.709282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"4868034d6edce3e9edf747bc0cb592deb2280579aa18d54c6e03523e666e43ba"}
Apr 24 22:30:29.709373 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.709330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"540643a60c967322045cc28e83d179beca4d687b4c29eb8333a643de5c15ad17"}
Apr 24 22:30:29.709373 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.709345 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"915ad9ceae52239e504c3099f8c19d2d09e2fc59a467faa54e17aeddfd8f431c"}
Apr 24 22:30:29.709373 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.709357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"ae6ec04f081134a5c25e5df83e6394a2e6fa03fd79ec860b06b0cf2f9470204a"}
Apr 24 22:30:29.709373 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.709369 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"414aef73494dfc298d0e3f9f6df13a27560643031bf5f4ed7635907e62b2dfa4"}
Apr 24 22:30:29.710445 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.709383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"1d730da740b0fe14448de5a9fb0a2e3a45261ccfd8ae25a0f71b8052a4f02e6d"}
Apr 24 22:30:29.710634 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.710610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b7xgj" event={"ID":"adea4cfb-400a-43d8-8b2d-0cd0d88160f5","Type":"ContainerStarted","Data":"1b3169a215b0347dcd90f222d2b30d9a7f7a08c6fea7c08143f4d30e236e8c2e"}
Apr 24 22:30:29.713065 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.713038 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7v756" event={"ID":"d413d1a6-f8ca-40a5-90ec-78dff39daaf1","Type":"ContainerStarted","Data":"7f209cbcc3cf28d4bca009518a61947b90a0c738fe2d43ab778c6a9bfb9c77d8"}
Apr 24 22:30:29.714754 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.714729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" event={"ID":"774ca196-c906-4414-b4f1-a64f30625a6e","Type":"ContainerStarted","Data":"8e02217132f8df8b50ad587bbdc2838af6d5046362383ca8ac6020371486dca4"}
Apr 24 22:30:29.716121 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.716097 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brsxm" event={"ID":"a75bb813-f2e4-4f8e-a0e9-677e2345d5f2","Type":"ContainerStarted","Data":"38dbecb0c85c8256880a79d4efde5d786ede1e311345d8a7605000effbc88a74"}
Apr 24 22:30:29.717487 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.717463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9lblg" event={"ID":"1a9aa9ab-1597-4fc6-8210-93b838855a27","Type":"ContainerStarted","Data":"2f33cf220f5905912dcedfe112605a2517a9a2482f67148b8c1be6a188c710c4"}
Apr 24 22:30:29.718928 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.718906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tdfbz" event={"ID":"5665ddf8-e176-403e-874a-d3f3d5a59d2e","Type":"ContainerStarted","Data":"2915d7e13f20ad3eb246f4bafc15f9a6825def57c36b556e65573f643e1f42fb"}
Apr 24 22:30:29.720316 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.720291 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b1b5f17-6df7-4280-b50e-f0241d9ab7d6" containerID="a0e462a0c36fd87dfa67156ba1af08c2bce9227796461e96956216632fa900ca" exitCode=0
Apr 24 22:30:29.720393 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.720328 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerDied","Data":"a0e462a0c36fd87dfa67156ba1af08c2bce9227796461e96956216632fa900ca"}
Apr 24 22:30:29.738628 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.738551 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-173.ec2.internal" podStartSLOduration=19.738535919 podStartE2EDuration="19.738535919s" podCreationTimestamp="2026-04-24 22:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:30:13.704608525 +0000 UTC m=+5.695930417" watchObservedRunningTime="2026-04-24 22:30:29.738535919 +0000 UTC m=+21.729857819"
Apr 24 22:30:29.738902 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.738873 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b7xgj" podStartSLOduration=4.19450221 podStartE2EDuration="21.738866844s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.163804394 +0000 UTC m=+3.155126272" lastFinishedPulling="2026-04-24 22:30:28.70816901 +0000 UTC m=+20.699490906" observedRunningTime="2026-04-24 22:30:29.737832836 +0000 UTC m=+21.729154749"
watchObservedRunningTime="2026-04-24 22:30:29.738866844 +0000 UTC m=+21.730188744" Apr 24 22:30:29.787833 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.787780 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tdfbz" podStartSLOduration=9.063868658 podStartE2EDuration="21.787764783s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.153088087 +0000 UTC m=+3.144409964" lastFinishedPulling="2026-04-24 22:30:23.876984199 +0000 UTC m=+15.868306089" observedRunningTime="2026-04-24 22:30:29.761670421 +0000 UTC m=+21.752992321" watchObservedRunningTime="2026-04-24 22:30:29.787764783 +0000 UTC m=+21.779086682" Apr 24 22:30:29.818141 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.818094 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9lblg" podStartSLOduration=4.264527103 podStartE2EDuration="21.818077187s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.154775997 +0000 UTC m=+3.146097878" lastFinishedPulling="2026-04-24 22:30:28.708326081 +0000 UTC m=+20.699647962" observedRunningTime="2026-04-24 22:30:29.788709572 +0000 UTC m=+21.780031472" watchObservedRunningTime="2026-04-24 22:30:29.818077187 +0000 UTC m=+21.809399087" Apr 24 22:30:29.843171 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.843113 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-brsxm" podStartSLOduration=4.292783803 podStartE2EDuration="21.84309433s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.158074495 +0000 UTC m=+3.149396377" lastFinishedPulling="2026-04-24 22:30:28.70838502 +0000 UTC m=+20.699706904" observedRunningTime="2026-04-24 22:30:29.818217529 +0000 UTC m=+21.809539440" watchObservedRunningTime="2026-04-24 22:30:29.84309433 +0000 UTC 
m=+21.834416242" Apr 24 22:30:29.843323 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.843286 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7v756" podStartSLOduration=4.256139947 podStartE2EDuration="21.843281299s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.161829092 +0000 UTC m=+3.153150973" lastFinishedPulling="2026-04-24 22:30:28.748970428 +0000 UTC m=+20.740292325" observedRunningTime="2026-04-24 22:30:29.842921808 +0000 UTC m=+21.834243738" watchObservedRunningTime="2026-04-24 22:30:29.843281299 +0000 UTC m=+21.834603198" Apr 24 22:30:29.921104 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:29.920961 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:30:30.066140 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.066101 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:30.066763 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.066744 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:30.517770 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.517669 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:30:29.921099679Z","UUID":"b7fa8514-c483-49bd-8108-2ea41475c8b0","Handler":null,"Name":"","Endpoint":""} Apr 24 22:30:30.520522 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.520496 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:30:30.520522 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.520530 2573 
csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:30:30.563853 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.563820 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:30.564032 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:30.563939 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:30.724529 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.724486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fk6tq" event={"ID":"f09ea6fe-aff1-4e92-a7ca-c70f50d186ec","Type":"ContainerStarted","Data":"393fb8d035a1f9274c68ba97780a797edc9d9deda4ae1ac9bba162bae082532f"} Apr 24 22:30:30.727248 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.727223 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" event={"ID":"774ca196-c906-4414-b4f1-a64f30625a6e","Type":"ContainerStarted","Data":"51041df791592d52c97b792ae93034e22b791fcbfd5adaa9a65999ac9109d77b"} Apr 24 22:30:30.728762 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.728726 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:30.729138 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:30.729119 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tdfbz" Apr 24 22:30:30.746483 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:30:30.746331 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fk6tq" podStartSLOduration=5.199641497 podStartE2EDuration="22.746312644s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.161496199 +0000 UTC m=+3.152818077" lastFinishedPulling="2026-04-24 22:30:28.70816733 +0000 UTC m=+20.699489224" observedRunningTime="2026-04-24 22:30:30.744663361 +0000 UTC m=+22.735985263" watchObservedRunningTime="2026-04-24 22:30:30.746312644 +0000 UTC m=+22.737634544" Apr 24 22:30:31.563113 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:31.563078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:31.563277 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:31.563229 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:31.732037 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:31.731995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"1fe3f1034dc33c97db8b8fc6d733abeffcc7b368888f4016ce3dd752d41dd9cb"} Apr 24 22:30:31.734058 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:31.734021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" event={"ID":"774ca196-c906-4414-b4f1-a64f30625a6e","Type":"ContainerStarted","Data":"05478d166887c253dde0c62c6c47cf71de18b43d105f690d496822e6fbd919fb"} Apr 24 22:30:31.757064 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:31.757010 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sptcd" podStartSLOduration=4.2308653960000004 podStartE2EDuration="23.756993651s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.159953073 +0000 UTC m=+3.151274963" lastFinishedPulling="2026-04-24 22:30:30.686081326 +0000 UTC m=+22.677403218" observedRunningTime="2026-04-24 22:30:31.756670177 +0000 UTC m=+23.747992074" watchObservedRunningTime="2026-04-24 22:30:31.756993651 +0000 UTC m=+23.748315555" Apr 24 22:30:32.563331 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:32.563294 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:32.563510 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:32.563421 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:33.562752 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:33.562720 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:33.563184 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:33.562829 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:33.740778 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:33.740612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" event={"ID":"f179e6c5-8a33-48d8-96ce-1400a4dcde57","Type":"ContainerStarted","Data":"db0c9cc146f29a2b7326618111f4abe72d785f918744fe7a7e17daec954db78e"} Apr 24 22:30:33.740936 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:33.740917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:33.741001 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:33.740949 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:33.756650 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:33.756627 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:33.807508 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:33.807443 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" podStartSLOduration=7.175002582 podStartE2EDuration="24.807421274s" podCreationTimestamp="2026-04-24 22:30:09 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.157528035 +0000 UTC m=+3.148849924" lastFinishedPulling="2026-04-24 22:30:28.789946725 +0000 UTC m=+20.781268616" observedRunningTime="2026-04-24 22:30:33.776552175 +0000 UTC m=+25.767874076" watchObservedRunningTime="2026-04-24 22:30:33.807421274 +0000 UTC m=+25.798743176" Apr 24 22:30:34.562819 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:34.562780 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:34.563279 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:34.562907 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:34.744004 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:34.743967 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b1b5f17-6df7-4280-b50e-f0241d9ab7d6" containerID="ccab986522822ebf406f53d596ba3f2190af57622142fad912d1f1354d7f2bbd" exitCode=0 Apr 24 22:30:34.744172 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:34.744058 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerDied","Data":"ccab986522822ebf406f53d596ba3f2190af57622142fad912d1f1354d7f2bbd"} Apr 24 22:30:34.744536 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:34.744515 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:34.762648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:34.762621 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:30:35.562941 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:35.562907 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:35.563300 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:35.563018 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:36.563256 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:36.563220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:36.563647 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:36.563325 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:36.748997 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:36.748963 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b1b5f17-6df7-4280-b50e-f0241d9ab7d6" containerID="ded76c8bded957f69670896b570e60e81ecdb806ea356925ead30d8054baa06f" exitCode=0 Apr 24 22:30:36.749150 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:36.749014 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerDied","Data":"ded76c8bded957f69670896b570e60e81ecdb806ea356925ead30d8054baa06f"} Apr 24 22:30:37.563270 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:37.563236 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:37.563682 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:37.563342 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:38.564306 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:38.564271 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:38.564689 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:38.564354 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:38.754223 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:38.754184 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b1b5f17-6df7-4280-b50e-f0241d9ab7d6" containerID="8a5ebff08931573f2c527b9fef5be712350392d5386dc98c9a8cc75da6e17954" exitCode=0 Apr 24 22:30:38.754394 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:38.754244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerDied","Data":"8a5ebff08931573f2c527b9fef5be712350392d5386dc98c9a8cc75da6e17954"} Apr 24 22:30:39.563732 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:39.563698 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:39.563901 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:39.563822 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:40.562982 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:40.562935 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:40.563452 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:40.563106 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:41.563498 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:41.563460 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:41.564038 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:41.563583 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:42.186833 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:42.186776 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:42.187047 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.186963 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:42.187047 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.187047 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs podName:b114ecc3-3191-4768-a2bc-d878a4044ee3 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.187024805 +0000 UTC m=+66.178346703 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs") pod "network-metrics-daemon-kdqw9" (UID: "b114ecc3-3191-4768-a2bc-d878a4044ee3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:30:42.287960 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:42.287918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:42.288160 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.288089 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:30:42.288160 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.288114 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:30:42.288160 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.288127 2573 projected.go:194] Error preparing data for projected volume kube-api-access-kcpfg for pod openshift-network-diagnostics/network-check-target-hqprm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:42.288324 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.288193 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg podName:1c5e89d8-8a8e-41eb-a725-89b55ae5ed48 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:31:14.28817455 +0000 UTC m=+66.279496449 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kcpfg" (UniqueName: "kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg") pod "network-check-target-hqprm" (UID: "1c5e89d8-8a8e-41eb-a725-89b55ae5ed48") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:30:42.563772 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:42.563733 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:42.564192 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:42.563864 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48" Apr 24 22:30:43.563064 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:43.563021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:43.563262 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:43.563166 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:44.008767 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:44.008730 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kdqw9"] Apr 24 22:30:44.010056 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:44.008851 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:30:44.010056 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:44.008967 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3" Apr 24 22:30:44.010170 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:44.010142 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hqprm"] Apr 24 22:30:44.010219 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:44.010212 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:30:44.010319 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:44.010293 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:45.562964 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:45.562932 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:45.563562 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:45.562933 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:45.563562 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:45.563040 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:45.563562 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:45.563152 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:45.769292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:45.769258 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b1b5f17-6df7-4280-b50e-f0241d9ab7d6" containerID="6c4cba67f369bff06420490621301b9bd51ef82650034a1396a24bfc28066416" exitCode=0
Apr 24 22:30:45.769455 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:45.769317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerDied","Data":"6c4cba67f369bff06420490621301b9bd51ef82650034a1396a24bfc28066416"}
Apr 24 22:30:46.773673 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:46.773476 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b1b5f17-6df7-4280-b50e-f0241d9ab7d6" containerID="8b5ea37242f519a064bdac824d1e02b1dd86785987cb22d7087fb679639f4d17" exitCode=0
Apr 24 22:30:46.773673 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:46.773563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerDied","Data":"8b5ea37242f519a064bdac824d1e02b1dd86785987cb22d7087fb679639f4d17"}
Apr 24 22:30:47.563371 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.563338 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:47.563548 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.563345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:47.563548 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:47.563438 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hqprm" podUID="1c5e89d8-8a8e-41eb-a725-89b55ae5ed48"
Apr 24 22:30:47.563548 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:47.563517 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kdqw9" podUID="b114ecc3-3191-4768-a2bc-d878a4044ee3"
Apr 24 22:30:47.778127 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.778080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" event={"ID":"5b1b5f17-6df7-4280-b50e-f0241d9ab7d6","Type":"ContainerStarted","Data":"20a39d2223436a8d90a0f50fec1f3b6637977b82d14cba04bcca72fdfd192fc2"}
Apr 24 22:30:47.791650 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.791628 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-173.ec2.internal" event="NodeReady"
Apr 24 22:30:47.791763 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.791741 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 22:30:47.827205 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.825320 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mq5d5" podStartSLOduration=6.291701747 podStartE2EDuration="39.825300653s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.16305145 +0000 UTC m=+3.154373329" lastFinishedPulling="2026-04-24 22:30:44.696650354 +0000 UTC m=+36.687972235" observedRunningTime="2026-04-24 22:30:47.82523382 +0000 UTC m=+39.816555731" watchObservedRunningTime="2026-04-24 22:30:47.825300653 +0000 UTC m=+39.816622556"
Apr 24 22:30:47.911782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.911753 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4btgp"]
Apr 24 22:30:47.930162 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.930120 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rcpmc"]
Apr 24 22:30:47.930323 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.930249 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:47.936560 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.936477 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hjbjm\""
Apr 24 22:30:47.936733 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.936713 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 22:30:47.936833 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.936758 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 22:30:47.957016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.956989 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:47.962531 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.962510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 22:30:47.962659 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.962548 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 22:30:47.962730 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.962669 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 22:30:47.962730 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.962676 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g8k4l\""
Apr 24 22:30:47.985436 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:47.985412 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rcpmc"]
Apr 24 22:30:48.011267 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.011233 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4btgp"]
Apr 24 22:30:48.030884 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.030814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762fb5cd-2743-4739-8c26-fe80bd1dcb02-cert\") pod \"ingress-canary-rcpmc\" (UID: \"762fb5cd-2743-4739-8c26-fe80bd1dcb02\") " pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.030884 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.030846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhz2\" (UniqueName: \"kubernetes.io/projected/762fb5cd-2743-4739-8c26-fe80bd1dcb02-kube-api-access-7lhz2\") pod \"ingress-canary-rcpmc\" (UID: \"762fb5cd-2743-4739-8c26-fe80bd1dcb02\") " pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.030884 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.030870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1295445-89e1-4b74-af4a-124c7863b64d-config-volume\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.031067 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.030886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwjn\" (UniqueName: \"kubernetes.io/projected/a1295445-89e1-4b74-af4a-124c7863b64d-kube-api-access-clwjn\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.031067 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.030994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1295445-89e1-4b74-af4a-124c7863b64d-metrics-tls\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.031067 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.031038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1295445-89e1-4b74-af4a-124c7863b64d-tmp-dir\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.032784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.032761 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n6js7"]
Apr 24 22:30:48.053328 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.053300 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.059977 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.059958 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 22:30:48.060216 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.060202 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 22:30:48.061298 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.061285 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 22:30:48.064507 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.064492 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 22:30:48.073473 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.073455 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ksx55\""
Apr 24 22:30:48.096070 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.096049 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n6js7"]
Apr 24 22:30:48.132316 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/985f830c-629d-4d9c-aa01-fae79c3e683a-data-volume\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.132316 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132314 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/985f830c-629d-4d9c-aa01-fae79c3e683a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762fb5cd-2743-4739-8c26-fe80bd1dcb02-cert\") pod \"ingress-canary-rcpmc\" (UID: \"762fb5cd-2743-4739-8c26-fe80bd1dcb02\") " pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhz2\" (UniqueName: \"kubernetes.io/projected/762fb5cd-2743-4739-8c26-fe80bd1dcb02-kube-api-access-7lhz2\") pod \"ingress-canary-rcpmc\" (UID: \"762fb5cd-2743-4739-8c26-fe80bd1dcb02\") " pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1295445-89e1-4b74-af4a-124c7863b64d-config-volume\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clwjn\" (UniqueName: \"kubernetes.io/projected/a1295445-89e1-4b74-af4a-124c7863b64d-kube-api-access-clwjn\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2q4k\" (UniqueName: \"kubernetes.io/projected/985f830c-629d-4d9c-aa01-fae79c3e683a-kube-api-access-h2q4k\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/985f830c-629d-4d9c-aa01-fae79c3e683a-crio-socket\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1295445-89e1-4b74-af4a-124c7863b64d-metrics-tls\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.132511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1295445-89e1-4b74-af4a-124c7863b64d-tmp-dir\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.132842 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/985f830c-629d-4d9c-aa01-fae79c3e683a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.132877 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.132854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a1295445-89e1-4b74-af4a-124c7863b64d-tmp-dir\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.133036 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.133018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a1295445-89e1-4b74-af4a-124c7863b64d-config-volume\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.136471 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.136454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1295445-89e1-4b74-af4a-124c7863b64d-metrics-tls\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.136577 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.136563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/762fb5cd-2743-4739-8c26-fe80bd1dcb02-cert\") pod \"ingress-canary-rcpmc\" (UID: \"762fb5cd-2743-4739-8c26-fe80bd1dcb02\") " pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.150721 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.150695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwjn\" (UniqueName: \"kubernetes.io/projected/a1295445-89e1-4b74-af4a-124c7863b64d-kube-api-access-clwjn\") pod \"dns-default-4btgp\" (UID: \"a1295445-89e1-4b74-af4a-124c7863b64d\") " pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.150829 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.150808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhz2\" (UniqueName: \"kubernetes.io/projected/762fb5cd-2743-4739-8c26-fe80bd1dcb02-kube-api-access-7lhz2\") pod \"ingress-canary-rcpmc\" (UID: \"762fb5cd-2743-4739-8c26-fe80bd1dcb02\") " pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.233428 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/985f830c-629d-4d9c-aa01-fae79c3e683a-crio-socket\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.233624 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/985f830c-629d-4d9c-aa01-fae79c3e683a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.233624 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/985f830c-629d-4d9c-aa01-fae79c3e683a-data-volume\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.233624 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/985f830c-629d-4d9c-aa01-fae79c3e683a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.233624 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233553 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2q4k\" (UniqueName: \"kubernetes.io/projected/985f830c-629d-4d9c-aa01-fae79c3e683a-kube-api-access-h2q4k\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.233816 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233652 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/985f830c-629d-4d9c-aa01-fae79c3e683a-crio-socket\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.233934 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.233910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/985f830c-629d-4d9c-aa01-fae79c3e683a-data-volume\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.234112 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.234096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/985f830c-629d-4d9c-aa01-fae79c3e683a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.236035 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.236015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/985f830c-629d-4d9c-aa01-fae79c3e683a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.239059 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.239040 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4btgp"
Apr 24 22:30:48.253252 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.253233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2q4k\" (UniqueName: \"kubernetes.io/projected/985f830c-629d-4d9c-aa01-fae79c3e683a-kube-api-access-h2q4k\") pod \"insights-runtime-extractor-n6js7\" (UID: \"985f830c-629d-4d9c-aa01-fae79c3e683a\") " pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.265186 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.265157 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rcpmc"
Apr 24 22:30:48.361314 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.361273 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n6js7"
Apr 24 22:30:48.399847 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.399817 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4btgp"]
Apr 24 22:30:48.404016 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:48.403788 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1295445_89e1_4b74_af4a_124c7863b64d.slice/crio-cc2c33b24541d731ff61f8dd885efb57a9f057748f07d6c22de28a7feaec1781 WatchSource:0}: Error finding container cc2c33b24541d731ff61f8dd885efb57a9f057748f07d6c22de28a7feaec1781: Status 404 returned error can't find the container with id cc2c33b24541d731ff61f8dd885efb57a9f057748f07d6c22de28a7feaec1781
Apr 24 22:30:48.409301 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.409262 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rcpmc"]
Apr 24 22:30:48.497433 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.497400 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n6js7"]
Apr 24 22:30:48.500706 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:48.500678 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod985f830c_629d_4d9c_aa01_fae79c3e683a.slice/crio-2a9782474c1abc9f895ee38536aaf070b53820e0007f9e644c574c3485c77d72 WatchSource:0}: Error finding container 2a9782474c1abc9f895ee38536aaf070b53820e0007f9e644c574c3485c77d72: Status 404 returned error can't find the container with id 2a9782474c1abc9f895ee38536aaf070b53820e0007f9e644c574c3485c77d72
Apr 24 22:30:48.782294 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.782251 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6js7" event={"ID":"985f830c-629d-4d9c-aa01-fae79c3e683a","Type":"ContainerStarted","Data":"8ceba29041f94da258c96c7172bb132bf1403fb868ccee6559707011e113e38d"}
Apr 24 22:30:48.782294 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.782297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6js7" event={"ID":"985f830c-629d-4d9c-aa01-fae79c3e683a","Type":"ContainerStarted","Data":"2a9782474c1abc9f895ee38536aaf070b53820e0007f9e644c574c3485c77d72"}
Apr 24 22:30:48.783643 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.783614 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rcpmc" event={"ID":"762fb5cd-2743-4739-8c26-fe80bd1dcb02","Type":"ContainerStarted","Data":"1b2a8807f1a8e348b118f3b074aea63eeba6f98512141afa4190c03595e34712"}
Apr 24 22:30:48.784808 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.784781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4btgp" event={"ID":"a1295445-89e1-4b74-af4a-124c7863b64d","Type":"ContainerStarted","Data":"cc2c33b24541d731ff61f8dd885efb57a9f057748f07d6c22de28a7feaec1781"}
Apr 24 22:30:48.934507 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.934419 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-txprq"]
Apr 24 22:30:48.939522 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.937680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:48.942084 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.942063 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 22:30:48.942951 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.942767 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 22:30:48.943462 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.943299 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 22:30:48.944809 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.944535 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7xhjr\""
Apr 24 22:30:48.944809 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.944622 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 22:30:48.944809 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.944538 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 22:30:48.961534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:48.961506 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-txprq"]
Apr 24 22:30:49.039759 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.039722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45abc856-83f5-4340-bf02-0178cdf538ef-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.039924 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.039777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs579\" (UniqueName: \"kubernetes.io/projected/45abc856-83f5-4340-bf02-0178cdf538ef-kube-api-access-bs579\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.039984 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.039953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45abc856-83f5-4340-bf02-0178cdf538ef-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.040032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.039996 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45abc856-83f5-4340-bf02-0178cdf538ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.140302 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.140272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs579\" (UniqueName: \"kubernetes.io/projected/45abc856-83f5-4340-bf02-0178cdf538ef-kube-api-access-bs579\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.140431 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.140374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45abc856-83f5-4340-bf02-0178cdf538ef-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.140431 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.140397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45abc856-83f5-4340-bf02-0178cdf538ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.140431 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.140420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45abc856-83f5-4340-bf02-0178cdf538ef-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.141485 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.141200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45abc856-83f5-4340-bf02-0178cdf538ef-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.144292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.144267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45abc856-83f5-4340-bf02-0178cdf538ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.144292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.144284 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45abc856-83f5-4340-bf02-0178cdf538ef-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.152949 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.152903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs579\" (UniqueName: \"kubernetes.io/projected/45abc856-83f5-4340-bf02-0178cdf538ef-kube-api-access-bs579\") pod \"prometheus-operator-5676c8c784-txprq\" (UID: \"45abc856-83f5-4340-bf02-0178cdf538ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.252862 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.252823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq"
Apr 24 22:30:49.428328 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.428294 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-txprq"]
Apr 24 22:30:49.563389 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.563352 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm"
Apr 24 22:30:49.563554 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.563364 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9"
Apr 24 22:30:49.569765 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.567260 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 22:30:49.569765 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.567545 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 22:30:49.569765 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.567620 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2wb7b\""
Apr 24 22:30:49.569765 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.567899 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 22:30:49.569765 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.568175 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-94q8c\""
Apr 24 22:30:49.724864 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:49.724823 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45abc856_83f5_4340_bf02_0178cdf538ef.slice/crio-04b74c90a5821351d7271afe3e2b4fa766579ccaadbe3e2be030f0d47942498b WatchSource:0}: Error finding container 04b74c90a5821351d7271afe3e2b4fa766579ccaadbe3e2be030f0d47942498b: Status 404 returned error can't find the container with id 04b74c90a5821351d7271afe3e2b4fa766579ccaadbe3e2be030f0d47942498b
Apr 24 22:30:49.788455 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.788419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6js7" event={"ID":"985f830c-629d-4d9c-aa01-fae79c3e683a","Type":"ContainerStarted","Data":"14a9c2aa24312edcf8e919c07a899dea80e0af52991e5837005357eb66a773a9"}
Apr 24 22:30:49.789580 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:49.789550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq" event={"ID":"45abc856-83f5-4340-bf02-0178cdf538ef","Type":"ContainerStarted","Data":"04b74c90a5821351d7271afe3e2b4fa766579ccaadbe3e2be030f0d47942498b"}
Apr 24 22:30:50.605319 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.605029 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-557b746645-mhwgt"]
Apr 24 22:30:50.608253 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.608225 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557b746645-mhwgt"
Apr 24 22:30:50.614534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.614508 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 22:30:50.614676 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.614587 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 22:30:50.614732 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.614508 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 22:30:50.614834 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.614811 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mqhxj\""
Apr 24 22:30:50.615728 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.615708 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 22:30:50.615728 ip-10-0-142-173 kubenswrapper[2573]:
I0424 22:30:50.615724 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 22:30:50.616742 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.616018 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 22:30:50.616742 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.616256 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 22:30:50.703003 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.702963 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557b746645-mhwgt"] Apr 24 22:30:50.759137 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.759103 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-serving-cert\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.759292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.759146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-service-ca\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.759292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.759189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-config\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " 
pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.759292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.759211 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-oauth-serving-cert\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.759292 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.759281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-oauth-config\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.759428 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.759309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kxmm\" (UniqueName: \"kubernetes.io/projected/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-kube-api-access-9kxmm\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.862454 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.862425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-config\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.862784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.862476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-oauth-serving-cert\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.862784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.862525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-oauth-config\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.862784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.862548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kxmm\" (UniqueName: \"kubernetes.io/projected/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-kube-api-access-9kxmm\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.862784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.862654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-serving-cert\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.862784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.862680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-service-ca\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.863320 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.863288 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-service-ca\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.863889 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.863658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-config\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.864006 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.863982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-oauth-serving-cert\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.867684 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.867653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-oauth-config\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.868056 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.868040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-serving-cert\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.878728 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.878706 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kxmm\" (UniqueName: \"kubernetes.io/projected/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-kube-api-access-9kxmm\") pod \"console-557b746645-mhwgt\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:50.919834 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:50.919735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:30:51.122151 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.122120 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557b746645-mhwgt"] Apr 24 22:30:51.338137 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:51.338106 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0e554b_018a_4aa3_9470_bd6c1bf1239d.slice/crio-6811ff406853452d83c95b18f3f4c9919e14b99c874f90de515a32dde408e8fc WatchSource:0}: Error finding container 6811ff406853452d83c95b18f3f4c9919e14b99c874f90de515a32dde408e8fc: Status 404 returned error can't find the container with id 6811ff406853452d83c95b18f3f4c9919e14b99c874f90de515a32dde408e8fc Apr 24 22:30:51.799972 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.799860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq" event={"ID":"45abc856-83f5-4340-bf02-0178cdf538ef","Type":"ContainerStarted","Data":"fd7fbb07302977be0838c6a74531eb4b0fb7736ce2d9ebd9fddfac40f6194a29"} Apr 24 22:30:51.800349 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.800311 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq" event={"ID":"45abc856-83f5-4340-bf02-0178cdf538ef","Type":"ContainerStarted","Data":"cfa772b78dc5162d689a899dc672e37b39da1c63f2427edb83dd8c7544539a19"} Apr 24 
22:30:51.801657 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.801622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rcpmc" event={"ID":"762fb5cd-2743-4739-8c26-fe80bd1dcb02","Type":"ContainerStarted","Data":"0b6022a23da1926b80d3aa66598e9b8e69c130c7c114193b43482bf8493fd5ee"} Apr 24 22:30:51.803614 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.803577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6js7" event={"ID":"985f830c-629d-4d9c-aa01-fae79c3e683a","Type":"ContainerStarted","Data":"08270c1189a1f52acb7c60449a339032c0c9ab172ee6e575e3a18d893a958a77"} Apr 24 22:30:51.805492 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.805462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4btgp" event={"ID":"a1295445-89e1-4b74-af4a-124c7863b64d","Type":"ContainerStarted","Data":"ea256e88e2239d0f67023c4a9ba2935cead134e6c63427037636ce650877b70d"} Apr 24 22:30:51.805609 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.805494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4btgp" event={"ID":"a1295445-89e1-4b74-af4a-124c7863b64d","Type":"ContainerStarted","Data":"f68c8ee5faedfc7c6566cfe9001f1c6c7457ab35f5a3414459157a9ee729c501"} Apr 24 22:30:51.805686 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.805632 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4btgp" Apr 24 22:30:51.806658 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.806636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557b746645-mhwgt" event={"ID":"bc0e554b-018a-4aa3-9470-bd6c1bf1239d","Type":"ContainerStarted","Data":"6811ff406853452d83c95b18f3f4c9919e14b99c874f90de515a32dde408e8fc"} Apr 24 22:30:51.834943 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.834892 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-txprq" podStartSLOduration=2.181059045 podStartE2EDuration="3.834874698s" podCreationTimestamp="2026-04-24 22:30:48 +0000 UTC" firstStartedPulling="2026-04-24 22:30:49.726740272 +0000 UTC m=+41.718062152" lastFinishedPulling="2026-04-24 22:30:51.380555906 +0000 UTC m=+43.371877805" observedRunningTime="2026-04-24 22:30:51.834099028 +0000 UTC m=+43.825420931" watchObservedRunningTime="2026-04-24 22:30:51.834874698 +0000 UTC m=+43.826196599" Apr 24 22:30:51.888623 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.888558 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rcpmc" podStartSLOduration=2.977021732 podStartE2EDuration="4.88854292s" podCreationTimestamp="2026-04-24 22:30:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:48.417543925 +0000 UTC m=+40.408865817" lastFinishedPulling="2026-04-24 22:30:50.329065125 +0000 UTC m=+42.320387005" observedRunningTime="2026-04-24 22:30:51.886325827 +0000 UTC m=+43.877647727" watchObservedRunningTime="2026-04-24 22:30:51.88854292 +0000 UTC m=+43.879864842" Apr 24 22:30:51.935086 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.935023 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n6js7" podStartSLOduration=2.681103374 podStartE2EDuration="4.935006031s" podCreationTimestamp="2026-04-24 22:30:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:48.578589273 +0000 UTC m=+40.569911151" lastFinishedPulling="2026-04-24 22:30:50.832491914 +0000 UTC m=+42.823813808" observedRunningTime="2026-04-24 22:30:51.933557188 +0000 UTC m=+43.924879103" watchObservedRunningTime="2026-04-24 22:30:51.935006031 +0000 UTC m=+43.926327932" Apr 24 22:30:51.967361 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:51.967303 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-4btgp" podStartSLOduration=3.047817264 podStartE2EDuration="4.967287541s" podCreationTimestamp="2026-04-24 22:30:47 +0000 UTC" firstStartedPulling="2026-04-24 22:30:48.405614174 +0000 UTC m=+40.396936052" lastFinishedPulling="2026-04-24 22:30:50.325084431 +0000 UTC m=+42.316406329" observedRunningTime="2026-04-24 22:30:51.966093499 +0000 UTC m=+43.957415399" watchObservedRunningTime="2026-04-24 22:30:51.967287541 +0000 UTC m=+43.958609441" Apr 24 22:30:53.668077 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.667988 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"] Apr 24 22:30:53.678322 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.678282 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.689783 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.689756 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-fh2ck\"" Apr 24 22:30:53.689919 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.689826 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 22:30:53.697313 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.697286 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:30:53.708540 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.708515 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"] Apr 24 22:30:53.760270 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.760244 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d84zz"] Apr 24 
22:30:53.766806 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.766782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.788459 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.788422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.788606 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.788474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.788606 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.788509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf2pg\" (UniqueName: \"kubernetes.io/projected/1a74f7b3-39e3-4846-b087-d71b01aa3b59-kube-api-access-gf2pg\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.788606 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.788543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a74f7b3-39e3-4846-b087-d71b01aa3b59-metrics-client-ca\") pod 
\"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.791511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.791472 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 22:30:53.791511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.791474 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 22:30:53.797929 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.797900 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 22:30:53.802894 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.802875 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-cjn95\"" Apr 24 22:30:53.816719 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.816653 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d84zz"] Apr 24 22:30:53.889471 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf2pg\" (UniqueName: \"kubernetes.io/projected/1a74f7b3-39e3-4846-b087-d71b01aa3b59-kube-api-access-gf2pg\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.889582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889516 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42cw4\" (UniqueName: 
\"kubernetes.io/projected/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-api-access-42cw4\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.889582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.889703 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.889703 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.889786 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.889786 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889750 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.889878 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889794 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" Apr 24 22:30:53.889926 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.889979 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.889943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a74f7b3-39e3-4846-b087-d71b01aa3b59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: 
\"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.889979 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:53.889963 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 22:30:53.890079 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:53.890031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-tls podName:1a74f7b3-39e3-4846-b087-d71b01aa3b59 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:54.390015592 +0000 UTC m=+46.381337470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-czf56" (UID: "1a74f7b3-39e3-4846-b087-d71b01aa3b59") : secret "openshift-state-metrics-tls" not found Apr 24 22:30:53.890516 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.890494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a74f7b3-39e3-4846-b087-d71b01aa3b59-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 22:30:53.893358 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.893234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" Apr 24 
22:30:53.928035 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.928014 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gcs4g"]
Apr 24 22:30:53.929341 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.929318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf2pg\" (UniqueName: \"kubernetes.io/projected/1a74f7b3-39e3-4846-b087-d71b01aa3b59-kube-api-access-gf2pg\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"
Apr 24 22:30:53.931870 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.931850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:53.935870 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.935848 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fcsn5\""
Apr 24 22:30:53.936706 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.936688 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 22:30:53.941842 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.941822 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 22:30:53.941937 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.941826 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 22:30:53.990831 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.990786 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.990841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.990889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.990971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42cw4\" (UniqueName: \"kubernetes.io/projected/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-api-access-42cw4\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.991013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991236 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.991044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991481 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.991442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991734 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.991712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.991802 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.991778 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.993696 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.993664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:53.993781 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:53.993721 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:54.045976 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.045944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42cw4\" (UniqueName: \"kubernetes.io/projected/1b8389f8-eb12-4e21-ada6-3f29e21b1aec-kube-api-access-42cw4\") pod \"kube-state-metrics-69db897b98-d84zz\" (UID: \"1b8389f8-eb12-4e21-ada6-3f29e21b1aec\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:54.076779 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.076750 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz"
Apr 24 22:30:54.091849 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.091815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdwk\" (UniqueName: \"kubernetes.io/projected/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-kube-api-access-wkdwk\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.091948 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.091856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-tls\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.091998 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.091954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-wtmp\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.091998 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.091985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-metrics-client-ca\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.092069 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.092011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-root\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.092069 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.092037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-accelerators-collector-config\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.092124 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.092088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-sys\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.092161 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.092144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.092201 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.092184 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-textfile\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193141 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-wtmp\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193141 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-metrics-client-ca\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193141 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-root\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-accelerators-collector-config\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-sys\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-textfile\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-root\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-wtmp\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdwk\" (UniqueName: \"kubernetes.io/projected/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-kube-api-access-wkdwk\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-sys\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-tls\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193797 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-textfile\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.193797 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.193725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-metrics-client-ca\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.194822 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.194798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-accelerators-collector-config\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.196267 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.196239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.196356 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.196331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-node-exporter-tls\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.245531 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.245496 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d84zz"]
Apr 24 22:30:54.249552 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:54.249525 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b8389f8_eb12_4e21_ada6_3f29e21b1aec.slice/crio-6424920ced4b6df1b43f33d1ba50acb565f37bd121df52da260ed60c1806f2d3 WatchSource:0}: Error finding container 6424920ced4b6df1b43f33d1ba50acb565f37bd121df52da260ed60c1806f2d3: Status 404 returned error can't find the container with id 6424920ced4b6df1b43f33d1ba50acb565f37bd121df52da260ed60c1806f2d3
Apr 24 22:30:54.250783 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.250761 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkdwk\" (UniqueName: \"kubernetes.io/projected/f4bd5f3e-34dc-4a5a-8885-40c15c8770b4-kube-api-access-wkdwk\") pod \"node-exporter-gcs4g\" (UID: \"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4\") " pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.396030 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.395991 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"
Apr 24 22:30:54.398556 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.398532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a74f7b3-39e3-4846-b087-d71b01aa3b59-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-czf56\" (UID: \"1a74f7b3-39e3-4846-b087-d71b01aa3b59\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"
Apr 24 22:30:54.542234 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.542199 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gcs4g"
Apr 24 22:30:54.550391 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:54.550366 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4bd5f3e_34dc_4a5a_8885_40c15c8770b4.slice/crio-143317cc28c0475772bf876296ebab1f17ec35d108bb9013cc10714e8e670f27 WatchSource:0}: Error finding container 143317cc28c0475772bf876296ebab1f17ec35d108bb9013cc10714e8e670f27: Status 404 returned error can't find the container with id 143317cc28c0475772bf876296ebab1f17ec35d108bb9013cc10714e8e670f27
Apr 24 22:30:54.589821 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.589790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"
Apr 24 22:30:54.726274 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.726240 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-czf56"]
Apr 24 22:30:54.730463 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:54.730434 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a74f7b3_39e3_4846_b087_d71b01aa3b59.slice/crio-3be0c0f755d42d7045f8d9634f882e010b221fcf4e8fc30079a3abbe264504c9 WatchSource:0}: Error finding container 3be0c0f755d42d7045f8d9634f882e010b221fcf4e8fc30079a3abbe264504c9: Status 404 returned error can't find the container with id 3be0c0f755d42d7045f8d9634f882e010b221fcf4e8fc30079a3abbe264504c9
Apr 24 22:30:54.825460 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.824888 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557b746645-mhwgt" event={"ID":"bc0e554b-018a-4aa3-9470-bd6c1bf1239d","Type":"ContainerStarted","Data":"eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a"}
Apr 24 22:30:54.826213 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.826183 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcs4g" event={"ID":"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4","Type":"ContainerStarted","Data":"143317cc28c0475772bf876296ebab1f17ec35d108bb9013cc10714e8e670f27"}
Apr 24 22:30:54.827910 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.827883 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" event={"ID":"1a74f7b3-39e3-4846-b087-d71b01aa3b59","Type":"ContainerStarted","Data":"b1ebdd140bf68e1edf185cd6740e8bae591baa2b7261068a161c3e500cf2256f"}
Apr 24 22:30:54.828013 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.827919 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" event={"ID":"1a74f7b3-39e3-4846-b087-d71b01aa3b59","Type":"ContainerStarted","Data":"3be0c0f755d42d7045f8d9634f882e010b221fcf4e8fc30079a3abbe264504c9"}
Apr 24 22:30:54.828974 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.828952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" event={"ID":"1b8389f8-eb12-4e21-ada6-3f29e21b1aec","Type":"ContainerStarted","Data":"6424920ced4b6df1b43f33d1ba50acb565f37bd121df52da260ed60c1806f2d3"}
Apr 24 22:30:54.878807 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.878765 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-557b746645-mhwgt" podStartSLOduration=2.340536182 podStartE2EDuration="4.878746252s" podCreationTimestamp="2026-04-24 22:30:50 +0000 UTC" firstStartedPulling="2026-04-24 22:30:51.339974655 +0000 UTC m=+43.331296534" lastFinishedPulling="2026-04-24 22:30:53.878184713 +0000 UTC m=+45.869506604" observedRunningTime="2026-04-24 22:30:54.870244723 +0000 UTC m=+46.861566623" watchObservedRunningTime="2026-04-24 22:30:54.878746252 +0000 UTC m=+46.870068158"
Apr 24 22:30:54.898417 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.896457 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:30:54.900682 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.900657 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:54.906519 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.906496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 22:30:54.906654 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.906582 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 22:30:54.906712 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.906496 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 22:30:54.906896 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.906870 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 22:30:54.907126 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.907103 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 22:30:54.907449 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.907431 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 22:30:54.907518 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.907449 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 22:30:54.907518 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.907483 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 22:30:54.907760 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.907740 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-vlsd8\""
Apr 24 22:30:54.908157 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.908128 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 22:30:54.941383 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:54.939718 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:30:55.001039 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001201 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2zdj\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-kube-api-access-s2zdj\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001201 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001201 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001201 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001400 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001400 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001400 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001331 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-config-out\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001400 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001400 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-web-config\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001699 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-config-volume\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001699 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.001699 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.001494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102157 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102157 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102157 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-config-out\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-web-config\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-config-volume\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102412 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2zdj\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-kube-api-access-s2zdj\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102752 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102752 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102752 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.102489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.102752 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:55.102687 2573 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 24 22:30:55.102752 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:30:55.102753 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls podName:a42e1f59-f356-48bb-9a07-a866264a3a12 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:55.602734827 +0000 UTC m=+47.594056710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12") : secret "alertmanager-main-tls" not found
Apr 24 22:30:55.103912 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.103881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.104037 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.103957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:30:55.104094 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.104066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.106859 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.106797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.106859 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.106799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-web-config\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.107019 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.106882 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-config-out\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.107019 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.106999 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.107567 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.107541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.108185 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.108165 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.108270 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.108172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-config-volume\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.108387 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.108362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.116144 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.116121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2zdj\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-kube-api-access-s2zdj\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.606975 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.606816 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.610079 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.610041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.824662 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.824640 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:30:55.840665 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.838321 2573 generic.go:358] "Generic (PLEG): container finished" podID="f4bd5f3e-34dc-4a5a-8885-40c15c8770b4" containerID="2cac598ab0407850c7d25b458bc3444d07e4879db9da977361aa23542d400800" exitCode=0 Apr 24 22:30:55.840665 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.838788 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcs4g" event={"ID":"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4","Type":"ContainerDied","Data":"2cac598ab0407850c7d25b458bc3444d07e4879db9da977361aa23542d400800"} Apr 24 22:30:55.845966 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:55.844881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" event={"ID":"1a74f7b3-39e3-4846-b087-d71b01aa3b59","Type":"ContainerStarted","Data":"b5444035d6f34e90331dc9dec2c4671f810d549ceea708b235682bd63733c3ba"} Apr 24 22:30:55.855539 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:30:55.855511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" event={"ID":"1b8389f8-eb12-4e21-ada6-3f29e21b1aec","Type":"ContainerStarted","Data":"bd47d79b4e80648b9338a6c5ec694bf097457ecbfc260e4c5b090fa7e545235c"} Apr 24 22:30:56.021664 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.021467 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:30:56.025646 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:56.025347 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda42e1f59_f356_48bb_9a07_a866264a3a12.slice/crio-6b53bfe5d6e24ff289402957c68f6bd6990680143f8a665c522147e2de0bf64f WatchSource:0}: Error finding container 6b53bfe5d6e24ff289402957c68f6bd6990680143f8a665c522147e2de0bf64f: Status 404 returned error can't find the container with id 6b53bfe5d6e24ff289402957c68f6bd6990680143f8a665c522147e2de0bf64f Apr 24 22:30:56.859683 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.859558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" event={"ID":"1a74f7b3-39e3-4846-b087-d71b01aa3b59","Type":"ContainerStarted","Data":"3f2983069ed250341df483d612d4afa38e0cab4ab3c37411033f9eac6d0980fb"} Apr 24 22:30:56.860734 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.860708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"6b53bfe5d6e24ff289402957c68f6bd6990680143f8a665c522147e2de0bf64f"} Apr 24 22:30:56.862486 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.862461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" 
event={"ID":"1b8389f8-eb12-4e21-ada6-3f29e21b1aec","Type":"ContainerStarted","Data":"c09950ada9adbff6c2ded48b79d84f952e10334e35fbcdd810830eda7f0521ef"} Apr 24 22:30:56.862617 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.862493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" event={"ID":"1b8389f8-eb12-4e21-ada6-3f29e21b1aec","Type":"ContainerStarted","Data":"d0ef42879d2cfade856b65707da3d21f1a7e6777f9d581b267c8052a0489caf2"} Apr 24 22:30:56.864135 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.864115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcs4g" event={"ID":"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4","Type":"ContainerStarted","Data":"d13e1db5e97fd1bb147e88877261c9c41d4c3c14c3ec962f68b874057680ecdf"} Apr 24 22:30:56.864231 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.864139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcs4g" event={"ID":"f4bd5f3e-34dc-4a5a-8885-40c15c8770b4","Type":"ContainerStarted","Data":"0f4204262845f62074141e43a704c54c19fed2dff8f8ae7e6f19b629a274b42b"} Apr 24 22:30:56.881013 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.880965 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-czf56" podStartSLOduration=2.212433907 podStartE2EDuration="3.880950558s" podCreationTimestamp="2026-04-24 22:30:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:54.865335992 +0000 UTC m=+46.856657884" lastFinishedPulling="2026-04-24 22:30:56.533852657 +0000 UTC m=+48.525174535" observedRunningTime="2026-04-24 22:30:56.879223998 +0000 UTC m=+48.870545898" watchObservedRunningTime="2026-04-24 22:30:56.880950558 +0000 UTC m=+48.872272457" Apr 24 22:30:56.912227 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.912173 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-gcs4g" podStartSLOduration=3.196387091 podStartE2EDuration="3.912158538s" podCreationTimestamp="2026-04-24 22:30:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:54.551971312 +0000 UTC m=+46.543293189" lastFinishedPulling="2026-04-24 22:30:55.267742756 +0000 UTC m=+47.259064636" observedRunningTime="2026-04-24 22:30:56.910911314 +0000 UTC m=+48.902233307" watchObservedRunningTime="2026-04-24 22:30:56.912158538 +0000 UTC m=+48.903480437" Apr 24 22:30:56.937618 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:56.937538 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-d84zz" podStartSLOduration=2.484606432 podStartE2EDuration="3.937519227s" podCreationTimestamp="2026-04-24 22:30:53 +0000 UTC" firstStartedPulling="2026-04-24 22:30:54.251520361 +0000 UTC m=+46.242842252" lastFinishedPulling="2026-04-24 22:30:55.704433156 +0000 UTC m=+47.695755047" observedRunningTime="2026-04-24 22:30:56.936574014 +0000 UTC m=+48.927895907" watchObservedRunningTime="2026-04-24 22:30:56.937519227 +0000 UTC m=+48.928841131" Apr 24 22:30:57.868506 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:57.868474 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerID="eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589" exitCode=0 Apr 24 22:30:57.868912 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:57.868591 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589"} Apr 24 22:30:59.495146 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.495119 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b8b5c6c84-vpr7n"] Apr 24 22:30:59.501067 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:30:59.501047 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.509786 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.509528 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b8b5c6c84-vpr7n"] Apr 24 22:30:59.510258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.510234 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 22:30:59.545083 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.544909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-oauth-config\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.545239 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.545112 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-config\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.545239 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.545191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-serving-cert\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.545239 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.545212 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-service-ca\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.545365 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.545239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-trusted-ca-bundle\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.545365 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.545262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-oauth-serving-cert\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.545365 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.545286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b82s9\" (UniqueName: \"kubernetes.io/projected/bd4ad058-ab10-47be-b0f1-e53be535e5f3-kube-api-access-b82s9\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646154 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-oauth-config\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 
22:30:59.646244 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-config\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646244 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-serving-cert\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646338 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-service-ca\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646338 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-trusted-ca-bundle\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646338 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-oauth-serving-cert\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " 
pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646338 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b82s9\" (UniqueName: \"kubernetes.io/projected/bd4ad058-ab10-47be-b0f1-e53be535e5f3-kube-api-access-b82s9\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.646976 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-config\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.647073 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.646953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-service-ca\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.647562 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.647537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-trusted-ca-bundle\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.649012 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.648992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-serving-cert\") pod \"console-7b8b5c6c84-vpr7n\" (UID: 
\"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.649092 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.649032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-oauth-config\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.659256 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.659228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b82s9\" (UniqueName: \"kubernetes.io/projected/bd4ad058-ab10-47be-b0f1-e53be535e5f3-kube-api-access-b82s9\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.661828 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.661804 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-oauth-serving-cert\") pod \"console-7b8b5c6c84-vpr7n\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") " pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.818555 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.818522 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:30:59.880529 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.880405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3"} Apr 24 22:30:59.880529 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.880447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f"} Apr 24 22:30:59.880529 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.880460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13"} Apr 24 22:30:59.970470 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:30:59.968274 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b8b5c6c84-vpr7n"] Apr 24 22:30:59.971257 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:30:59.971226 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4ad058_ab10_47be_b0f1_e53be535e5f3.slice/crio-4a271916a7c266554e3a1e719ed81a2609e575dce1c09eb5eaaa532d28268e02 WatchSource:0}: Error finding container 4a271916a7c266554e3a1e719ed81a2609e575dce1c09eb5eaaa532d28268e02: Status 404 returned error can't find the container with id 4a271916a7c266554e3a1e719ed81a2609e575dce1c09eb5eaaa532d28268e02 Apr 24 22:31:00.884557 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.884494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7b8b5c6c84-vpr7n" event={"ID":"bd4ad058-ab10-47be-b0f1-e53be535e5f3","Type":"ContainerStarted","Data":"d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f"} Apr 24 22:31:00.884557 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.884534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8b5c6c84-vpr7n" event={"ID":"bd4ad058-ab10-47be-b0f1-e53be535e5f3","Type":"ContainerStarted","Data":"4a271916a7c266554e3a1e719ed81a2609e575dce1c09eb5eaaa532d28268e02"} Apr 24 22:31:00.887561 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.887537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad"} Apr 24 22:31:00.887705 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.887570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6"} Apr 24 22:31:00.903861 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.903807 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b8b5c6c84-vpr7n" podStartSLOduration=1.903789817 podStartE2EDuration="1.903789817s" podCreationTimestamp="2026-04-24 22:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:00.902402743 +0000 UTC m=+52.893724645" watchObservedRunningTime="2026-04-24 22:31:00.903789817 +0000 UTC m=+52.895111716" Apr 24 22:31:00.920812 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.920776 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:31:00.920979 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.920825 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:31:00.926063 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:00.926040 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:31:01.812492 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:01.812465 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4btgp" Apr 24 22:31:01.892906 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:01.892870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerStarted","Data":"a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55"} Apr 24 22:31:01.896960 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:01.896932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:31:01.926200 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:01.926159 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.753936064 podStartE2EDuration="7.926145744s" podCreationTimestamp="2026-04-24 22:30:54 +0000 UTC" firstStartedPulling="2026-04-24 22:30:56.028516193 +0000 UTC m=+48.019838073" lastFinishedPulling="2026-04-24 22:31:01.200725858 +0000 UTC m=+53.192047753" observedRunningTime="2026-04-24 22:31:01.924301746 +0000 UTC m=+53.915623647" watchObservedRunningTime="2026-04-24 22:31:01.926145744 +0000 UTC m=+53.917467643" Apr 24 22:31:06.766143 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:06.766108 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-ckgcl" Apr 24 22:31:09.818853 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:09.818818 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:31:09.818853 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:09.818861 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:31:09.823750 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:09.823725 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:31:09.918853 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:09.918826 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b8b5c6c84-vpr7n" Apr 24 22:31:09.971551 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:09.971517 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-557b746645-mhwgt"] Apr 24 22:31:14.280013 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.279973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: \"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:31:14.283062 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.283041 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:31:14.293933 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.293906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b114ecc3-3191-4768-a2bc-d878a4044ee3-metrics-certs\") pod \"network-metrics-daemon-kdqw9\" (UID: 
\"b114ecc3-3191-4768-a2bc-d878a4044ee3\") " pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:31:14.380773 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.380736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:31:14.383818 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.383794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:31:14.395483 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.395452 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:31:14.404246 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.404222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcpfg\" (UniqueName: \"kubernetes.io/projected/1c5e89d8-8a8e-41eb-a725-89b55ae5ed48-kube-api-access-kcpfg\") pod \"network-check-target-hqprm\" (UID: \"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48\") " pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:31:14.479817 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.479785 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-94q8c\"" Apr 24 22:31:14.486920 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.486898 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2wb7b\"" Apr 24 22:31:14.487944 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.487927 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:31:14.492221 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.492206 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kdqw9" Apr 24 22:31:14.646588 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.646557 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hqprm"] Apr 24 22:31:14.649520 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:31:14.649489 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c5e89d8_8a8e_41eb_a725_89b55ae5ed48.slice/crio-d0be7f2eed2bcc55e727faf1f0451a0f9779d7ce5e3ec094582fec97be696f43 WatchSource:0}: Error finding container d0be7f2eed2bcc55e727faf1f0451a0f9779d7ce5e3ec094582fec97be696f43: Status 404 returned error can't find the container with id d0be7f2eed2bcc55e727faf1f0451a0f9779d7ce5e3ec094582fec97be696f43 Apr 24 22:31:14.665014 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.664988 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kdqw9"] Apr 24 22:31:14.667586 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:31:14.667560 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb114ecc3_3191_4768_a2bc_d878a4044ee3.slice/crio-213db8bfa732713c7ce13ca3656f098d95e84bf68aa40ed166ea43563d190a2b WatchSource:0}: Error finding container 213db8bfa732713c7ce13ca3656f098d95e84bf68aa40ed166ea43563d190a2b: Status 404 returned error can't find the container with id 213db8bfa732713c7ce13ca3656f098d95e84bf68aa40ed166ea43563d190a2b Apr 24 22:31:14.930885 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.930798 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-hqprm" event={"ID":"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48","Type":"ContainerStarted","Data":"d0be7f2eed2bcc55e727faf1f0451a0f9779d7ce5e3ec094582fec97be696f43"} Apr 24 22:31:14.931874 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:14.931853 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdqw9" event={"ID":"b114ecc3-3191-4768-a2bc-d878a4044ee3","Type":"ContainerStarted","Data":"213db8bfa732713c7ce13ca3656f098d95e84bf68aa40ed166ea43563d190a2b"} Apr 24 22:31:15.937272 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:15.937233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdqw9" event={"ID":"b114ecc3-3191-4768-a2bc-d878a4044ee3","Type":"ContainerStarted","Data":"0cdafddf98ac89f211b9a6fc5405cd63d55dc8c15d90f9a44dad7dcdbc90c73c"} Apr 24 22:31:16.942093 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:16.942054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kdqw9" event={"ID":"b114ecc3-3191-4768-a2bc-d878a4044ee3","Type":"ContainerStarted","Data":"6f9e42563c825a136ac26f3fe6cd0f6916967ab84e445f39511d123ccf330a38"} Apr 24 22:31:16.962095 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:16.962041 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kdqw9" podStartSLOduration=67.916853375 podStartE2EDuration="1m8.962019881s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:31:14.669525823 +0000 UTC m=+66.660847701" lastFinishedPulling="2026-04-24 22:31:15.714692314 +0000 UTC m=+67.706014207" observedRunningTime="2026-04-24 22:31:16.960711041 +0000 UTC m=+68.952032940" watchObservedRunningTime="2026-04-24 22:31:16.962019881 +0000 UTC m=+68.953341806" Apr 24 22:31:17.947153 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:17.947058 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hqprm" event={"ID":"1c5e89d8-8a8e-41eb-a725-89b55ae5ed48","Type":"ContainerStarted","Data":"214e65a04196105c5f88a2f306e470f0d22fbe0fdb94ac32dd42104fa2b4639d"} Apr 24 22:31:17.967642 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:17.967571 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hqprm" podStartSLOduration=67.124531944 podStartE2EDuration="1m9.967550805s" podCreationTimestamp="2026-04-24 22:30:08 +0000 UTC" firstStartedPulling="2026-04-24 22:31:14.651482531 +0000 UTC m=+66.642804410" lastFinishedPulling="2026-04-24 22:31:17.494501393 +0000 UTC m=+69.485823271" observedRunningTime="2026-04-24 22:31:17.965790952 +0000 UTC m=+69.957112865" watchObservedRunningTime="2026-04-24 22:31:17.967550805 +0000 UTC m=+69.958872703" Apr 24 22:31:18.950858 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:18.950743 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:31:34.993052 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:34.993008 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-557b746645-mhwgt" podUID="bc0e554b-018a-4aa3-9470-bd6c1bf1239d" containerName="console" containerID="cri-o://eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a" gracePeriod=15 Apr 24 22:31:35.228725 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.228704 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-557b746645-mhwgt_bc0e554b-018a-4aa3-9470-bd6c1bf1239d/console/0.log" Apr 24 22:31:35.228857 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.228782 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:31:35.358382 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358285 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kxmm\" (UniqueName: \"kubernetes.io/projected/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-kube-api-access-9kxmm\") pod \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " Apr 24 22:31:35.358382 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358319 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-serving-cert\") pod \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " Apr 24 22:31:35.358382 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358360 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-service-ca\") pod \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " Apr 24 22:31:35.358382 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358376 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-config\") pod \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " Apr 24 22:31:35.358733 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358400 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-oauth-serving-cert\") pod \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " Apr 24 22:31:35.358733 
ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358447 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-oauth-config\") pod \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\" (UID: \"bc0e554b-018a-4aa3-9470-bd6c1bf1239d\") " Apr 24 22:31:35.358881 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358851 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-service-ca" (OuterVolumeSpecName: "service-ca") pod "bc0e554b-018a-4aa3-9470-bd6c1bf1239d" (UID: "bc0e554b-018a-4aa3-9470-bd6c1bf1239d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:35.358933 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358875 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bc0e554b-018a-4aa3-9470-bd6c1bf1239d" (UID: "bc0e554b-018a-4aa3-9470-bd6c1bf1239d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:35.358933 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.358868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-config" (OuterVolumeSpecName: "console-config") pod "bc0e554b-018a-4aa3-9470-bd6c1bf1239d" (UID: "bc0e554b-018a-4aa3-9470-bd6c1bf1239d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:31:35.360917 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.360883 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bc0e554b-018a-4aa3-9470-bd6c1bf1239d" (UID: "bc0e554b-018a-4aa3-9470-bd6c1bf1239d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:35.361029 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.360914 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bc0e554b-018a-4aa3-9470-bd6c1bf1239d" (UID: "bc0e554b-018a-4aa3-9470-bd6c1bf1239d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:31:35.361029 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.360922 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-kube-api-access-9kxmm" (OuterVolumeSpecName: "kube-api-access-9kxmm") pod "bc0e554b-018a-4aa3-9470-bd6c1bf1239d" (UID: "bc0e554b-018a-4aa3-9470-bd6c1bf1239d"). InnerVolumeSpecName "kube-api-access-9kxmm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:31:35.459801 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.459763 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-oauth-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:31:35.459801 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.459796 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kxmm\" (UniqueName: \"kubernetes.io/projected/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-kube-api-access-9kxmm\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:31:35.459801 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.459805 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:31:35.460080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.459814 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-service-ca\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:31:35.460080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.459825 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-console-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:31:35.460080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:35.459834 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc0e554b-018a-4aa3-9470-bd6c1bf1239d-oauth-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:31:36.001436 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:31:36.001407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-557b746645-mhwgt_bc0e554b-018a-4aa3-9470-bd6c1bf1239d/console/0.log" Apr 24 22:31:36.001895 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.001450 2573 generic.go:358] "Generic (PLEG): container finished" podID="bc0e554b-018a-4aa3-9470-bd6c1bf1239d" containerID="eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a" exitCode=2 Apr 24 22:31:36.001895 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.001512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557b746645-mhwgt" event={"ID":"bc0e554b-018a-4aa3-9470-bd6c1bf1239d","Type":"ContainerDied","Data":"eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a"} Apr 24 22:31:36.001895 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.001545 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557b746645-mhwgt" event={"ID":"bc0e554b-018a-4aa3-9470-bd6c1bf1239d","Type":"ContainerDied","Data":"6811ff406853452d83c95b18f3f4c9919e14b99c874f90de515a32dde408e8fc"} Apr 24 22:31:36.001895 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.001558 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-557b746645-mhwgt" Apr 24 22:31:36.001895 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.001563 2573 scope.go:117] "RemoveContainer" containerID="eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a" Apr 24 22:31:36.010322 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.010306 2573 scope.go:117] "RemoveContainer" containerID="eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a" Apr 24 22:31:36.010680 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:31:36.010577 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a\": container with ID starting with eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a not found: ID does not exist" containerID="eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a" Apr 24 22:31:36.010680 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.010628 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a"} err="failed to get container status \"eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a\": rpc error: code = NotFound desc = could not find container \"eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a\": container with ID starting with eec5807a242a58367a1b1fdd4d506fe453a5025d7043bca578a9468fd9b8654a not found: ID does not exist" Apr 24 22:31:36.023018 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.022985 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-557b746645-mhwgt"] Apr 24 22:31:36.027480 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:36.027456 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-557b746645-mhwgt"] Apr 24 22:31:36.567257 ip-10-0-142-173 kubenswrapper[2573]: 
I0424 22:31:36.567217 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0e554b-018a-4aa3-9470-bd6c1bf1239d" path="/var/lib/kubelet/pods/bc0e554b-018a-4aa3-9470-bd6c1bf1239d/volumes" Apr 24 22:31:49.955873 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:31:49.955843 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hqprm" Apr 24 22:32:14.044016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.043979 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:32:14.044481 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.044431 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="alertmanager" containerID="cri-o://ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13" gracePeriod=120 Apr 24 22:32:14.044574 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.044483 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy" containerID="cri-o://383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6" gracePeriod=120 Apr 24 22:32:14.044574 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.044501 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-web" containerID="cri-o://e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3" gracePeriod=120 Apr 24 22:32:14.044574 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.044527 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-metric" containerID="cri-o://2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad" gracePeriod=120 Apr 24 22:32:14.044750 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.044538 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="prom-label-proxy" containerID="cri-o://a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55" gracePeriod=120 Apr 24 22:32:14.045227 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:14.044839 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="config-reloader" containerID="cri-o://1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f" gracePeriod=120 Apr 24 22:32:15.119348 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119325 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerID="a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55" exitCode=0 Apr 24 22:32:15.119348 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119346 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerID="2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad" exitCode=0 Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119352 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerID="383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6" exitCode=0 Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119358 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" 
containerID="1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f" exitCode=0 Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119363 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerID="ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13" exitCode=0 Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119390 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55"} Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad"} Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6"} Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f"} Apr 24 22:32:15.119756 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.119483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13"} Apr 24 22:32:15.281346 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.281321 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:15.352712 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352673 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-metric\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352722 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-config-out\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352755 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352779 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-trusted-ca-bundle\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:32:15.352809 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-metrics-client-ca\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352838 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-cluster-tls-config\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352870 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-main-db\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.352900 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352893 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-config-volume\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.353247 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352929 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-web-config\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.353247 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352962 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.353247 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.352990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-tls-assets\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.353247 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.353047 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-web\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.353247 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.353119 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2zdj\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-kube-api-access-s2zdj\") pod \"a42e1f59-f356-48bb-9a07-a866264a3a12\" (UID: \"a42e1f59-f356-48bb-9a07-a866264a3a12\") " Apr 24 22:32:15.353504 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.353349 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:15.354157 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.354124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:15.354797 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.354481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:32:15.355997 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.355805 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-config-out" (OuterVolumeSpecName: "config-out") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:32:15.356093 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.356062 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.356258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.356212 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.356443 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.356411 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-kube-api-access-s2zdj" (OuterVolumeSpecName: "kube-api-access-s2zdj") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "kube-api-access-s2zdj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:32:15.357037 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.357012 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:32:15.357236 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.357215 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-config-volume" (OuterVolumeSpecName: "config-volume") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.358058 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.358036 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.358227 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.358207 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.359924 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.359829 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.366140 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.366075 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-web-config" (OuterVolumeSpecName: "web-config") pod "a42e1f59-f356-48bb-9a07-a866264a3a12" (UID: "a42e1f59-f356-48bb-9a07-a866264a3a12"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:15.454621 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454562 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-main-db\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454621 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454619 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-config-volume\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454633 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-web-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454646 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454658 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-tls-assets\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454683 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454695 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s2zdj\" (UniqueName: \"kubernetes.io/projected/a42e1f59-f356-48bb-9a07-a866264a3a12-kube-api-access-s2zdj\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454710 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454722 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a42e1f59-f356-48bb-9a07-a866264a3a12-config-out\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454735 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-secret-alertmanager-main-tls\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454747 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454760 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a42e1f59-f356-48bb-9a07-a866264a3a12-metrics-client-ca\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:15.454841 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:15.454773 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a42e1f59-f356-48bb-9a07-a866264a3a12-cluster-tls-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:16.124163 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.124135 2573 generic.go:358] "Generic (PLEG): container finished" podID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerID="e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3" exitCode=0
Apr 24 22:32:16.124545 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.124177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3"}
Apr 24 22:32:16.124545 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.124200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a42e1f59-f356-48bb-9a07-a866264a3a12","Type":"ContainerDied","Data":"6b53bfe5d6e24ff289402957c68f6bd6990680143f8a665c522147e2de0bf64f"}
Apr 24 22:32:16.124545 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.124216 2573 scope.go:117] "RemoveContainer" containerID="a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55"
Apr 24 22:32:16.124545 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.124245 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.131577 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.131549 2573 scope.go:117] "RemoveContainer" containerID="2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad"
Apr 24 22:32:16.138209 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.138180 2573 scope.go:117] "RemoveContainer" containerID="383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6"
Apr 24 22:32:16.144892 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.144874 2573 scope.go:117] "RemoveContainer" containerID="e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3"
Apr 24 22:32:16.151109 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.151085 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:16.153166 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.153147 2573 scope.go:117] "RemoveContainer" containerID="1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f"
Apr 24 22:32:16.153842 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.153818 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:16.159704 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.159680 2573 scope.go:117] "RemoveContainer" containerID="ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13"
Apr 24 22:32:16.166087 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.165917 2573 scope.go:117] "RemoveContainer" containerID="eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589"
Apr 24 22:32:16.172050 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172033 2573 scope.go:117] "RemoveContainer" containerID="a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55"
Apr 24 22:32:16.172334 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.172308 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55\": container with ID starting with a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55 not found: ID does not exist" containerID="a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55"
Apr 24 22:32:16.172419 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172346 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55"} err="failed to get container status \"a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55\": rpc error: code = NotFound desc = could not find container \"a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55\": container with ID starting with a4e86a9439bc3052eb8cafd41bcea41bdf0b97cbeaec5fcdd3aa07f832a0ba55 not found: ID does not exist"
Apr 24 22:32:16.172419 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172373 2573 scope.go:117] "RemoveContainer" containerID="2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad"
Apr 24 22:32:16.172690 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.172672 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad\": container with ID starting with 2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad not found: ID does not exist" containerID="2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad"
Apr 24 22:32:16.172768 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172699 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad"} err="failed to get container status \"2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad\": rpc error: code = NotFound desc = could not find container \"2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad\": container with ID starting with 2a8375c38abd648b9bce23bd4499ccd8632109c1071ff2920cb645ce895641ad not found: ID does not exist"
Apr 24 22:32:16.172768 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172723 2573 scope.go:117] "RemoveContainer" containerID="383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6"
Apr 24 22:32:16.172944 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.172928 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6\": container with ID starting with 383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6 not found: ID does not exist" containerID="383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6"
Apr 24 22:32:16.173003 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172952 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6"} err="failed to get container status \"383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6\": rpc error: code = NotFound desc = could not find container \"383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6\": container with ID starting with 383b62051a8b796f7104ce1e4a0cfa4b3c90cf71e1eeb49b9fe7fd3e4bf137a6 not found: ID does not exist"
Apr 24 22:32:16.173003 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.172973 2573 scope.go:117] "RemoveContainer" containerID="e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3"
Apr 24 22:32:16.173187 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.173169 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3\": container with ID starting with e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3 not found: ID does not exist" containerID="e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3"
Apr 24 22:32:16.173228 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173192 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3"} err="failed to get container status \"e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3\": rpc error: code = NotFound desc = could not find container \"e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3\": container with ID starting with e93172059a8df1b8c38ae3df6cf716f11a7e77b3dcc507dd25716f95140692d3 not found: ID does not exist"
Apr 24 22:32:16.173228 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173208 2573 scope.go:117] "RemoveContainer" containerID="1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f"
Apr 24 22:32:16.173403 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.173387 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f\": container with ID starting with 1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f not found: ID does not exist" containerID="1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f"
Apr 24 22:32:16.173445 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173408 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f"} err="failed to get container status \"1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f\": rpc error: code = NotFound desc = could not find container \"1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f\": container with ID starting with 1f31a4c521625b4651b1e22dc60785091ac33109a996363d3c278f6bceac639f not found: ID does not exist"
Apr 24 22:32:16.173445 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173424 2573 scope.go:117] "RemoveContainer" containerID="ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13"
Apr 24 22:32:16.173580 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.173566 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13\": container with ID starting with ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13 not found: ID does not exist" containerID="ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13"
Apr 24 22:32:16.173647 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173582 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13"} err="failed to get container status \"ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13\": rpc error: code = NotFound desc = could not find container \"ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13\": container with ID starting with ae06ed4c91e20bd7a5e27bf3c56f7efece4733ed53f19ed5fd3ff288f65c4e13 not found: ID does not exist"
Apr 24 22:32:16.173647 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173622 2573 scope.go:117] "RemoveContainer" containerID="eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589"
Apr 24 22:32:16.173794 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:16.173779 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589\": container with ID starting with eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589 not found: ID does not exist" containerID="eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589"
Apr 24 22:32:16.173830 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.173794 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589"} err="failed to get container status \"eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589\": rpc error: code = NotFound desc = could not find container \"eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589\": container with ID starting with eadb4408a9759d9376070d59ee6492af0cfc7c6789f7b9e2b374f7bea1927589 not found: ID does not exist"
Apr 24 22:32:16.182567 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182543 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:16.182802 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182789 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc0e554b-018a-4aa3-9470-bd6c1bf1239d" containerName="console"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182803 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0e554b-018a-4aa3-9470-bd6c1bf1239d" containerName="console"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182813 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="alertmanager"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182819 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="alertmanager"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182825 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182830 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182841 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-metric"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182846 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-metric"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182855 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="config-reloader"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182861 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="config-reloader"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182865 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-web"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182870 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-web"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182876 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="prom-label-proxy"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182881 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="prom-label-proxy"
Apr 24 22:32:16.182880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182887 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="init-config-reloader"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182892 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="init-config-reloader"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182926 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-metric"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182935 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="config-reloader"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182941 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="prom-label-proxy"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182947 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="alertmanager"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182952 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy-web"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182957 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" containerName="kube-rbac-proxy"
Apr 24 22:32:16.183582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.182963 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc0e554b-018a-4aa3-9470-bd6c1bf1239d" containerName="console"
Apr 24 22:32:16.188262 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.188242 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.190997 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.190970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 24 22:32:16.191094 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.191035 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 24 22:32:16.191250 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.191230 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 24 22:32:16.191375 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.191293 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 24 22:32:16.191375 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.191305 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-vlsd8\""
Apr 24 22:32:16.191375 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.191317 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 24 22:32:16.191658 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.191642 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 24 22:32:16.193210 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.193188 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 24 22:32:16.193529 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.193511 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 24 22:32:16.196979 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.196958 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 24 22:32:16.208245 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.208224 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 24 22:32:16.260575 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8b4\" (UniqueName: \"kubernetes.io/projected/eeabef7e-b69f-4036-9683-5fa6a064923d-kube-api-access-7j8b4\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260575 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eeabef7e-b69f-4036-9683-5fa6a064923d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260640 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-web-config\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eeabef7e-b69f-4036-9683-5fa6a064923d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eeabef7e-b69f-4036-9683-5fa6a064923d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.260793 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eeabef7e-b69f-4036-9683-5fa6a064923d-config-out\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.261091 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.261091 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-config-volume\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.261091 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eeabef7e-b69f-4036-9683-5fa6a064923d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.261091 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.260947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361330 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8b4\" (UniqueName: \"kubernetes.io/projected/eeabef7e-b69f-4036-9683-5fa6a064923d-kube-api-access-7j8b4\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361330 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eeabef7e-b69f-4036-9683-5fa6a064923d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361627 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361627 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361627 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-web-config\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361627 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361516 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361627 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eeabef7e-b69f-4036-9683-5fa6a064923d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eeabef7e-b69f-4036-9683-5fa6a064923d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eeabef7e-b69f-4036-9683-5fa6a064923d-config-out\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.361876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.361794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/eeabef7e-b69f-4036-9683-5fa6a064923d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.362782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.362752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eeabef7e-b69f-4036-9683-5fa6a064923d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.362910 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.362815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-config-volume\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 24 22:32:16.362910 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.362844 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/eeabef7e-b69f-4036-9683-5fa6a064923d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.363020 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.362910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365114 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.364684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365114 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.364780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365114 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.364789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-web-config\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365114 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.364819 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/eeabef7e-b69f-4036-9683-5fa6a064923d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365114 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.364884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365114 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.365105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365458 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.365363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/eeabef7e-b69f-4036-9683-5fa6a064923d-config-out\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365458 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.365446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.365626 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:32:16.365586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eeabef7e-b69f-4036-9683-5fa6a064923d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.367046 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.367028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/eeabef7e-b69f-4036-9683-5fa6a064923d-config-volume\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.370738 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.370688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8b4\" (UniqueName: \"kubernetes.io/projected/eeabef7e-b69f-4036-9683-5fa6a064923d-kube-api-access-7j8b4\") pod \"alertmanager-main-0\" (UID: \"eeabef7e-b69f-4036-9683-5fa6a064923d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.498342 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.498302 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 22:32:16.567805 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.567768 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42e1f59-f356-48bb-9a07-a866264a3a12" path="/var/lib/kubelet/pods/a42e1f59-f356-48bb-9a07-a866264a3a12/volumes" Apr 24 22:32:16.624475 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:16.624393 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 22:32:16.628323 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:32:16.628294 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeabef7e_b69f_4036_9683_5fa6a064923d.slice/crio-f2a3cf24ec861d563fd2dea55d6e0ee0b2a41073951abb795fe0c2497d17a35a WatchSource:0}: Error finding container f2a3cf24ec861d563fd2dea55d6e0ee0b2a41073951abb795fe0c2497d17a35a: Status 404 returned error can't find the container with id f2a3cf24ec861d563fd2dea55d6e0ee0b2a41073951abb795fe0c2497d17a35a Apr 24 22:32:17.129354 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:17.129318 2573 generic.go:358] "Generic (PLEG): container finished" podID="eeabef7e-b69f-4036-9683-5fa6a064923d" containerID="4e7e500820cbfcee0d5d518911821d68b1ee59ed5fe5d9ce30456607d83614d1" exitCode=0 Apr 24 22:32:17.129715 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:17.129405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerDied","Data":"4e7e500820cbfcee0d5d518911821d68b1ee59ed5fe5d9ce30456607d83614d1"} Apr 24 22:32:17.129715 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:17.129440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"f2a3cf24ec861d563fd2dea55d6e0ee0b2a41073951abb795fe0c2497d17a35a"} Apr 24 22:32:18.067541 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.067509 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-64c75dbc66-jbgq8"] Apr 24 22:32:18.070837 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.070819 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.074430 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.073650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 24 22:32:18.074568 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.074435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 24 22:32:18.074568 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.074481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 24 22:32:18.074568 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.074481 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 24 22:32:18.074568 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.074518 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 24 22:32:18.074812 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.074576 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7dplp\"" Apr 24 22:32:18.079912 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.079768 2573 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 24 22:32:18.081398 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.081372 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64c75dbc66-jbgq8"] Apr 24 22:32:18.135953 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.135916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"f3d9539fe6a64126dfe0e9616117461ee1b5940a4206d50a88076fba3e1c149c"} Apr 24 22:32:18.136394 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.135959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"07d5c7dd96070c4ad75f70b7d480d82031baed7e4494ca75978209b1503bf51f"} Apr 24 22:32:18.136394 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.135975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"afcbcf8086cc870f55ed195134409acf67761b6721e9af4018e0a856ad26377c"} Apr 24 22:32:18.136394 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.135989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"4c3140349f94d4b471855f848f8917a2d86841149cd9c1e0571346438510c1d5"} Apr 24 22:32:18.136394 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.136000 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"008f778b7650cb7f92af08216e4492a25602891d515ad9f348e290195a3b0ae7"} Apr 24 22:32:18.136394 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.136011 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"eeabef7e-b69f-4036-9683-5fa6a064923d","Type":"ContainerStarted","Data":"1e80e52b19c552b206c8a3761384ba932daca1f580c5360fe11976b1152d372b"} Apr 24 22:32:18.162753 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.162702 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.162684999 podStartE2EDuration="2.162684999s" podCreationTimestamp="2026-04-24 22:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:18.161344546 +0000 UTC m=+130.152666447" watchObservedRunningTime="2026-04-24 22:32:18.162684999 +0000 UTC m=+130.154006898" Apr 24 22:32:18.178439 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-serving-certs-ca-bundle\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njppc\" (UniqueName: \"kubernetes.io/projected/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-kube-api-access-njppc\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " 
pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-secret-telemeter-client\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-telemeter-client-tls\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-federate-client-tls\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-metrics-client-ca\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178975 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178757 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.178975 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.178788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279388 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-metrics-client-ca\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279575 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279575 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279736 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-serving-certs-ca-bundle\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279736 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njppc\" (UniqueName: \"kubernetes.io/projected/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-kube-api-access-njppc\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279736 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-secret-telemeter-client\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279736 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-telemeter-client-tls\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: 
\"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.279936 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.279757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-federate-client-tls\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.280283 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.280242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-metrics-client-ca\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.280569 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.280524 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.280702 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.280630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-serving-certs-ca-bundle\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.282039 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.282011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.282226 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.282206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-federate-client-tls\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.282335 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.282316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-secret-telemeter-client\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.282546 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.282528 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-telemeter-client-tls\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: \"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.289148 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.289125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njppc\" (UniqueName: \"kubernetes.io/projected/c7d6e71b-fb2b-4986-afbf-f73e54eca75d-kube-api-access-njppc\") pod \"telemeter-client-64c75dbc66-jbgq8\" (UID: 
\"c7d6e71b-fb2b-4986-afbf-f73e54eca75d\") " pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.384012 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.383923 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" Apr 24 22:32:18.538950 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:18.538925 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64c75dbc66-jbgq8"] Apr 24 22:32:18.541822 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:32:18.541793 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d6e71b_fb2b_4986_afbf_f73e54eca75d.slice/crio-9bb66f45bd0cb795f652b03316fa452ae8ba59ea96d457202e5a227fc3b09c65 WatchSource:0}: Error finding container 9bb66f45bd0cb795f652b03316fa452ae8ba59ea96d457202e5a227fc3b09c65: Status 404 returned error can't find the container with id 9bb66f45bd0cb795f652b03316fa452ae8ba59ea96d457202e5a227fc3b09c65 Apr 24 22:32:19.141278 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:19.141243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" event={"ID":"c7d6e71b-fb2b-4986-afbf-f73e54eca75d","Type":"ContainerStarted","Data":"9bb66f45bd0cb795f652b03316fa452ae8ba59ea96d457202e5a227fc3b09c65"} Apr 24 22:32:21.149788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.149749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" event={"ID":"c7d6e71b-fb2b-4986-afbf-f73e54eca75d","Type":"ContainerStarted","Data":"7a560f6569eb67a3378be2d52cee617fb3f3040db3846384d75bebf6a38bfc30"} Apr 24 22:32:21.149788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.149790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" 
event={"ID":"c7d6e71b-fb2b-4986-afbf-f73e54eca75d","Type":"ContainerStarted","Data":"e33ee0d1e176b5dbad10ec709892473ce1c68d7154b8b0311b83798f1eb55bd8"}
Apr 24 22:32:21.149788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.149799 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" event={"ID":"c7d6e71b-fb2b-4986-afbf-f73e54eca75d","Type":"ContainerStarted","Data":"a7609101b3eea432a887452bfc4d46a2e6832b3c42e4f3310e24f5fc8d0fcef9"}
Apr 24 22:32:21.177542 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.177489 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-64c75dbc66-jbgq8" podStartSLOduration=1.435132927 podStartE2EDuration="3.177475979s" podCreationTimestamp="2026-04-24 22:32:18 +0000 UTC" firstStartedPulling="2026-04-24 22:32:18.543526628 +0000 UTC m=+130.534848505" lastFinishedPulling="2026-04-24 22:32:20.285869676 +0000 UTC m=+132.277191557" observedRunningTime="2026-04-24 22:32:21.173046995 +0000 UTC m=+133.164368908" watchObservedRunningTime="2026-04-24 22:32:21.177475979 +0000 UTC m=+133.168797879"
Apr 24 22:32:21.820860 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.820826 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74b6d6f645-sbjrt"]
Apr 24 22:32:21.824042 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.824025 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.832826 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.832797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b6d6f645-sbjrt"]
Apr 24 22:32:21.910432 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-oauth-serving-cert\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.910629 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-service-ca\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.910629 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-trusted-ca-bundle\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.910629 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-oauth-config\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.910629 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmhp\" (UniqueName: \"kubernetes.io/projected/7514eb02-fe4e-4a20-9c47-5af9e8231330-kube-api-access-hrmhp\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.910783 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-serving-cert\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:21.910783 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:21.910701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-config\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.011630 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-trusted-ca-bundle\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.011630 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-oauth-config\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.011876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmhp\" (UniqueName: \"kubernetes.io/projected/7514eb02-fe4e-4a20-9c47-5af9e8231330-kube-api-access-hrmhp\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.011876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-serving-cert\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.011876 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-config\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.012013 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-oauth-serving-cert\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.012013 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.011962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-service-ca\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.012538 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.012453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-trusted-ca-bundle\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.012538 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.012529 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-config\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.012788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.012626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-oauth-serving-cert\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.012788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.012644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-service-ca\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.014256 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.014224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-oauth-config\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.014369 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.014352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-serving-cert\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.025219 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.025187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrmhp\" (UniqueName: \"kubernetes.io/projected/7514eb02-fe4e-4a20-9c47-5af9e8231330-kube-api-access-hrmhp\") pod \"console-74b6d6f645-sbjrt\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.134788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.134694 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:22.275908 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:22.275864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b6d6f645-sbjrt"]
Apr 24 22:32:22.278538 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:32:22.278510 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7514eb02_fe4e_4a20_9c47_5af9e8231330.slice/crio-eedd57371f4a34292a0128c1071414512cc932bcd1c13e18c43a754e1a884816 WatchSource:0}: Error finding container eedd57371f4a34292a0128c1071414512cc932bcd1c13e18c43a754e1a884816: Status 404 returned error can't find the container with id eedd57371f4a34292a0128c1071414512cc932bcd1c13e18c43a754e1a884816
Apr 24 22:32:23.157831 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:23.157795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b6d6f645-sbjrt" event={"ID":"7514eb02-fe4e-4a20-9c47-5af9e8231330","Type":"ContainerStarted","Data":"89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7"}
Apr 24 22:32:23.157831 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:23.157839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b6d6f645-sbjrt" event={"ID":"7514eb02-fe4e-4a20-9c47-5af9e8231330","Type":"ContainerStarted","Data":"eedd57371f4a34292a0128c1071414512cc932bcd1c13e18c43a754e1a884816"}
Apr 24 22:32:23.178992 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:23.178939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74b6d6f645-sbjrt" podStartSLOduration=2.178924989 podStartE2EDuration="2.178924989s" podCreationTimestamp="2026-04-24 22:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:32:23.177549784 +0000 UTC m=+135.168871682" watchObservedRunningTime="2026-04-24 22:32:23.178924989 +0000 UTC m=+135.170246889"
Apr 24 22:32:32.135357 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:32.135319 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:32.135909 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:32.135421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:32.139981 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:32.139960 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:32.187441 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:32.187415 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74b6d6f645-sbjrt"
Apr 24 22:32:32.234393 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:32.234359 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b8b5c6c84-vpr7n"]
Apr 24 22:32:57.254439 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.254375 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b8b5c6c84-vpr7n" podUID="bd4ad058-ab10-47be-b0f1-e53be535e5f3" containerName="console" containerID="cri-o://d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f" gracePeriod=15
Apr 24 22:32:57.483703 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.483678 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b8b5c6c84-vpr7n_bd4ad058-ab10-47be-b0f1-e53be535e5f3/console/0.log"
Apr 24 22:32:57.483825 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.483737 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8b5c6c84-vpr7n"
Apr 24 22:32:57.610939 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.610849 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b82s9\" (UniqueName: \"kubernetes.io/projected/bd4ad058-ab10-47be-b0f1-e53be535e5f3-kube-api-access-b82s9\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.610939 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.610904 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-oauth-config\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.610939 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.610932 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-oauth-serving-cert\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.611191 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-service-ca\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.611191 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611147 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-trusted-ca-bundle\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.611265 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611212 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-serving-cert\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.611265 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611240 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-config\") pod \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\" (UID: \"bd4ad058-ab10-47be-b0f1-e53be535e5f3\") "
Apr 24 22:32:57.611265 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611242 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:57.611493 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611474 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-oauth-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:57.611555 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611471 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-service-ca" (OuterVolumeSpecName: "service-ca") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:57.611679 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611655 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:57.611738 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.611690 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-config" (OuterVolumeSpecName: "console-config") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:32:57.613235 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.613213 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:57.613800 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.613773 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:32:57.613897 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.613800 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4ad058-ab10-47be-b0f1-e53be535e5f3-kube-api-access-b82s9" (OuterVolumeSpecName: "kube-api-access-b82s9") pod "bd4ad058-ab10-47be-b0f1-e53be535e5f3" (UID: "bd4ad058-ab10-47be-b0f1-e53be535e5f3"). InnerVolumeSpecName "kube-api-access-b82s9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:32:57.712258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.712208 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-service-ca\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:57.712258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.712253 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-trusted-ca-bundle\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:57.712258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.712265 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:57.712258 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.712274 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:57.712520 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.712285 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b82s9\" (UniqueName: \"kubernetes.io/projected/bd4ad058-ab10-47be-b0f1-e53be535e5f3-kube-api-access-b82s9\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:57.712520 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:57.712295 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4ad058-ab10-47be-b0f1-e53be535e5f3-console-oauth-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:32:58.259213 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.259182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b8b5c6c84-vpr7n_bd4ad058-ab10-47be-b0f1-e53be535e5f3/console/0.log"
Apr 24 22:32:58.259653 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.259226 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd4ad058-ab10-47be-b0f1-e53be535e5f3" containerID="d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f" exitCode=2
Apr 24 22:32:58.259653 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.259260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8b5c6c84-vpr7n" event={"ID":"bd4ad058-ab10-47be-b0f1-e53be535e5f3","Type":"ContainerDied","Data":"d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f"}
Apr 24 22:32:58.259653 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.259301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8b5c6c84-vpr7n" event={"ID":"bd4ad058-ab10-47be-b0f1-e53be535e5f3","Type":"ContainerDied","Data":"4a271916a7c266554e3a1e719ed81a2609e575dce1c09eb5eaaa532d28268e02"}
Apr 24 22:32:58.259653 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.259309 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8b5c6c84-vpr7n"
Apr 24 22:32:58.259653 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.259316 2573 scope.go:117] "RemoveContainer" containerID="d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f"
Apr 24 22:32:58.267406 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.267390 2573 scope.go:117] "RemoveContainer" containerID="d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f"
Apr 24 22:32:58.267648 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:32:58.267629 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f\": container with ID starting with d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f not found: ID does not exist" containerID="d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f"
Apr 24 22:32:58.267736 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.267655 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f"} err="failed to get container status \"d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f\": rpc error: code = NotFound desc = could not find container \"d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f\": container with ID starting with d5cc1785abe97b2e508e02768f1429d22d1b290542a9cd5d45ab71e91e49cf2f not found: ID does not exist"
Apr 24 22:32:58.282717 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.282687 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b8b5c6c84-vpr7n"]
Apr 24 22:32:58.284289 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.284270 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b8b5c6c84-vpr7n"]
Apr 24 22:32:58.566947 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:32:58.566872 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4ad058-ab10-47be-b0f1-e53be535e5f3" path="/var/lib/kubelet/pods/bd4ad058-ab10-47be-b0f1-e53be535e5f3/volumes"
Apr 24 22:33:43.988720 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:43.988691 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b4f86655-v7j2c"]
Apr 24 22:33:43.989188 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:43.988988 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd4ad058-ab10-47be-b0f1-e53be535e5f3" containerName="console"
Apr 24 22:33:43.989188 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:43.989000 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4ad058-ab10-47be-b0f1-e53be535e5f3" containerName="console"
Apr 24 22:33:43.989188 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:43.989046 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd4ad058-ab10-47be-b0f1-e53be535e5f3" containerName="console"
Apr 24 22:33:43.991985 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:43.991966 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.000547 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.000523 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b4f86655-v7j2c"]
Apr 24 22:33:44.043630 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-trusted-ca-bundle\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.043630 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-config\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.043886 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043696 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-oauth-serving-cert\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.043886 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-oauth-config\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.043886 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgxz7\" (UniqueName: \"kubernetes.io/projected/6e031001-588c-46cf-8201-3b36d4d7f4a9-kube-api-access-cgxz7\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.043886 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-serving-cert\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.043886 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.043852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-service-ca\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.144958 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.144915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-oauth-serving-cert\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145138 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.144970 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-oauth-config\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145138 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgxz7\" (UniqueName: \"kubernetes.io/projected/6e031001-588c-46cf-8201-3b36d4d7f4a9-kube-api-access-cgxz7\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145138 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-serving-cert\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145296 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-service-ca\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145296 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-trusted-ca-bundle\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145296 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-config\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145754 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-oauth-serving-cert\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.145981 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.145957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-service-ca\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.146072 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.146053 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-config\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.146123 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.146106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-trusted-ca-bundle\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.147442 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.147422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-oauth-config\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.147656 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.147640 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-serving-cert\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.152718 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.152700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgxz7\" (UniqueName: \"kubernetes.io/projected/6e031001-588c-46cf-8201-3b36d4d7f4a9-kube-api-access-cgxz7\") pod \"console-69b4f86655-v7j2c\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.302068 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.301978 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:44.428255 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:44.424785 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b4f86655-v7j2c"]
Apr 24 22:33:45.390825 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:45.390781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b4f86655-v7j2c" event={"ID":"6e031001-588c-46cf-8201-3b36d4d7f4a9","Type":"ContainerStarted","Data":"6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183"}
Apr 24 22:33:45.390825 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:45.390820 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b4f86655-v7j2c" event={"ID":"6e031001-588c-46cf-8201-3b36d4d7f4a9","Type":"ContainerStarted","Data":"9f1df0613ba3540ef548118cb56da2f0f8632bdc42337c74109b7136be66613b"}
Apr 24 22:33:45.408043 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:45.407988 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b4f86655-v7j2c" podStartSLOduration=2.407973694 podStartE2EDuration="2.407973694s" podCreationTimestamp="2026-04-24 22:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:33:45.407221386 +0000 UTC m=+217.398543287" watchObservedRunningTime="2026-04-24 22:33:45.407973694 +0000 UTC m=+217.399295594"
Apr 24 22:33:54.303111 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:54.303070 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:54.303111 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:54.303123 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:54.308549 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:54.308517 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:54.419254 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:54.419228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69b4f86655-v7j2c"
Apr 24 22:33:54.463367 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:33:54.463328 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b6d6f645-sbjrt"]
Apr 24 22:34:08.251587 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.251555 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vp77h"]
Apr 24 22:34:08.254819 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.254792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vp77h"
Apr 24 22:34:08.257608 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.257573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 22:34:08.261853 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.261832 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vp77h"]
Apr 24 22:34:08.341720 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.341677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-original-pull-secret\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h"
Apr 24 22:34:08.341720 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.341719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName:
\"kubernetes.io/host-path/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-kubelet-config\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.341948 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.341740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-dbus\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.442550 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.442510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-original-pull-secret\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.442550 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.442546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-kubelet-config\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.442759 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.442564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-dbus\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.442759 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.442658 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-kubelet-config\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.442759 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.442711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-dbus\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.445236 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.445215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:34:08.455138 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.455108 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d-original-pull-secret\") pod \"global-pull-secret-syncer-vp77h\" (UID: \"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d\") " pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.564420 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.564336 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vp77h" Apr 24 22:34:08.683027 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:08.682994 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vp77h"] Apr 24 22:34:08.685794 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:34:08.685766 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d0837a_7ff4_4b1d_ae6d_ab9e70350e7d.slice/crio-c1d2fae6db817829ef4764d281ed35c735ca03840a16dadd2870cccd2ed5da1e WatchSource:0}: Error finding container c1d2fae6db817829ef4764d281ed35c735ca03840a16dadd2870cccd2ed5da1e: Status 404 returned error can't find the container with id c1d2fae6db817829ef4764d281ed35c735ca03840a16dadd2870cccd2ed5da1e Apr 24 22:34:09.459077 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:09.459024 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vp77h" event={"ID":"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d","Type":"ContainerStarted","Data":"c1d2fae6db817829ef4764d281ed35c735ca03840a16dadd2870cccd2ed5da1e"} Apr 24 22:34:12.470261 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:12.470163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vp77h" event={"ID":"69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d","Type":"ContainerStarted","Data":"86d976a72f673f24d1642547e4635e4a1afb9fc7dcf6c8ddd9fbc79008104ead"} Apr 24 22:34:12.486684 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:12.486631 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vp77h" podStartSLOduration=0.985915334 podStartE2EDuration="4.486617573s" podCreationTimestamp="2026-04-24 22:34:08 +0000 UTC" firstStartedPulling="2026-04-24 22:34:08.68732411 +0000 UTC m=+240.678645988" lastFinishedPulling="2026-04-24 22:34:12.188026349 +0000 UTC m=+244.179348227" 
observedRunningTime="2026-04-24 22:34:12.485191681 +0000 UTC m=+244.476513582" watchObservedRunningTime="2026-04-24 22:34:12.486617573 +0000 UTC m=+244.477939510" Apr 24 22:34:19.485382 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.485264 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74b6d6f645-sbjrt" podUID="7514eb02-fe4e-4a20-9c47-5af9e8231330" containerName="console" containerID="cri-o://89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7" gracePeriod=15 Apr 24 22:34:19.719979 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.719953 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b6d6f645-sbjrt_7514eb02-fe4e-4a20-9c47-5af9e8231330/console/0.log" Apr 24 22:34:19.720110 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.720016 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b6d6f645-sbjrt" Apr 24 22:34:19.836767 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836672 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-oauth-config\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.836767 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836753 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-config\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.836991 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836782 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-trusted-ca-bundle\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.836991 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836838 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-oauth-serving-cert\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.836991 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836870 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrmhp\" (UniqueName: \"kubernetes.io/projected/7514eb02-fe4e-4a20-9c47-5af9e8231330-kube-api-access-hrmhp\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.836991 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836893 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-serving-cert\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.836991 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.836925 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-service-ca\") pod \"7514eb02-fe4e-4a20-9c47-5af9e8231330\" (UID: \"7514eb02-fe4e-4a20-9c47-5af9e8231330\") " Apr 24 22:34:19.837241 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.837158 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-config" (OuterVolumeSpecName: "console-config") pod 
"7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:19.837295 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.837245 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:19.837377 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.837349 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:19.837632 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.837401 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-service-ca" (OuterVolumeSpecName: "service-ca") pod "7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:34:19.839196 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.839165 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:34:19.839308 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.839220 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7514eb02-fe4e-4a20-9c47-5af9e8231330-kube-api-access-hrmhp" (OuterVolumeSpecName: "kube-api-access-hrmhp") pod "7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "kube-api-access-hrmhp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:34:19.839308 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.839229 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7514eb02-fe4e-4a20-9c47-5af9e8231330" (UID: "7514eb02-fe4e-4a20-9c47-5af9e8231330"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:34:19.937828 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.937780 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:19.937828 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.937820 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-trusted-ca-bundle\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:19.937828 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.937832 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-oauth-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:19.937828 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:34:19.937841 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrmhp\" (UniqueName: \"kubernetes.io/projected/7514eb02-fe4e-4a20-9c47-5af9e8231330-kube-api-access-hrmhp\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:19.938093 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.937851 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:19.938093 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.937860 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7514eb02-fe4e-4a20-9c47-5af9e8231330-service-ca\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:19.938093 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:19.937868 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7514eb02-fe4e-4a20-9c47-5af9e8231330-console-oauth-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:34:20.494726 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.494700 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b6d6f645-sbjrt_7514eb02-fe4e-4a20-9c47-5af9e8231330/console/0.log" Apr 24 22:34:20.495177 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.494740 2573 generic.go:358] "Generic (PLEG): container finished" podID="7514eb02-fe4e-4a20-9c47-5af9e8231330" containerID="89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7" exitCode=2 Apr 24 22:34:20.495177 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.494778 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b6d6f645-sbjrt" 
event={"ID":"7514eb02-fe4e-4a20-9c47-5af9e8231330","Type":"ContainerDied","Data":"89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7"} Apr 24 22:34:20.495177 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.494811 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b6d6f645-sbjrt" Apr 24 22:34:20.495177 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.494826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b6d6f645-sbjrt" event={"ID":"7514eb02-fe4e-4a20-9c47-5af9e8231330","Type":"ContainerDied","Data":"eedd57371f4a34292a0128c1071414512cc932bcd1c13e18c43a754e1a884816"} Apr 24 22:34:20.495177 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.494849 2573 scope.go:117] "RemoveContainer" containerID="89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7" Apr 24 22:34:20.503136 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.503117 2573 scope.go:117] "RemoveContainer" containerID="89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7" Apr 24 22:34:20.503422 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:20.503405 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7\": container with ID starting with 89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7 not found: ID does not exist" containerID="89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7" Apr 24 22:34:20.503459 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.503431 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7"} err="failed to get container status \"89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7\": rpc error: code = NotFound desc = could not find container 
\"89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7\": container with ID starting with 89460c1238f9c092db565bf0eb231df938a40e4c82e537524ca693e14176c8f7 not found: ID does not exist" Apr 24 22:34:20.515178 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.515151 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b6d6f645-sbjrt"] Apr 24 22:34:20.519432 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.519409 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74b6d6f645-sbjrt"] Apr 24 22:34:20.566883 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:20.566855 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7514eb02-fe4e-4a20-9c47-5af9e8231330" path="/var/lib/kubelet/pods/7514eb02-fe4e-4a20-9c47-5af9e8231330/volumes" Apr 24 22:34:26.502406 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.502370 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht"] Apr 24 22:34:26.502820 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.502702 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7514eb02-fe4e-4a20-9c47-5af9e8231330" containerName="console" Apr 24 22:34:26.502820 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.502713 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7514eb02-fe4e-4a20-9c47-5af9e8231330" containerName="console" Apr 24 22:34:26.502820 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.502760 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7514eb02-fe4e-4a20-9c47-5af9e8231330" containerName="console" Apr 24 22:34:26.506072 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.506054 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.509619 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.509572 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 22:34:26.509619 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.509580 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 22:34:26.510570 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.510554 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v5dnb\"" Apr 24 22:34:26.520494 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.520472 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht"] Apr 24 22:34:26.583465 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.583432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.583664 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.583483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.583664 
ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.583510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm78p\" (UniqueName: \"kubernetes.io/projected/b3434274-8fa9-481e-92d1-dcad4d626b4e-kube-api-access-zm78p\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.684030 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.683993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.684221 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.684055 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.684221 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.684081 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm78p\" (UniqueName: \"kubernetes.io/projected/b3434274-8fa9-481e-92d1-dcad4d626b4e-kube-api-access-zm78p\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.684380 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:34:26.684360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.684436 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.684384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.692911 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.692880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm78p\" (UniqueName: \"kubernetes.io/projected/b3434274-8fa9-481e-92d1-dcad4d626b4e-kube-api-access-zm78p\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.815648 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.815537 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" Apr 24 22:34:26.945781 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:26.945741 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht"] Apr 24 22:34:26.948613 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:34:26.948570 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3434274_8fa9_481e_92d1_dcad4d626b4e.slice/crio-00fa8922442a41bd00dbe468138fbc36cf2df0b150a56fc5430dba93cf3a11df WatchSource:0}: Error finding container 00fa8922442a41bd00dbe468138fbc36cf2df0b150a56fc5430dba93cf3a11df: Status 404 returned error can't find the container with id 00fa8922442a41bd00dbe468138fbc36cf2df0b150a56fc5430dba93cf3a11df Apr 24 22:34:27.516250 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:27.516213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" event={"ID":"b3434274-8fa9-481e-92d1-dcad4d626b4e","Type":"ContainerStarted","Data":"00fa8922442a41bd00dbe468138fbc36cf2df0b150a56fc5430dba93cf3a11df"} Apr 24 22:34:32.534497 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:32.534457 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerID="3f19c4a8bfba9c26a746be51fea6ae1958c56f65bdf592561aefc53c6c71aa6d" exitCode=0 Apr 24 22:34:32.534887 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:32.534516 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" event={"ID":"b3434274-8fa9-481e-92d1-dcad4d626b4e","Type":"ContainerDied","Data":"3f19c4a8bfba9c26a746be51fea6ae1958c56f65bdf592561aefc53c6c71aa6d"} Apr 24 22:34:34.542522 ip-10-0-142-173 kubenswrapper[2573]: 
I0424 22:34:34.542429 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerID="7b5e3b1dba716725c7f3e48a1f5f5c1f505040b193794a679d93e85468ae9620" exitCode=0
Apr 24 22:34:34.542522 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:34.542507 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" event={"ID":"b3434274-8fa9-481e-92d1-dcad4d626b4e","Type":"ContainerDied","Data":"7b5e3b1dba716725c7f3e48a1f5f5c1f505040b193794a679d93e85468ae9620"}
Apr 24 22:34:41.567990 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:41.567947 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerID="7e6d9547fbf8cb56ea4cd5d3dec0e3471f2038d23013b53614da11ba6aef8e16" exitCode=0
Apr 24 22:34:41.568379 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:41.568035 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" event={"ID":"b3434274-8fa9-481e-92d1-dcad4d626b4e","Type":"ContainerDied","Data":"7e6d9547fbf8cb56ea4cd5d3dec0e3471f2038d23013b53614da11ba6aef8e16"}
Apr 24 22:34:42.693858 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.693834 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht"
Apr 24 22:34:42.720830 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.720797 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm78p\" (UniqueName: \"kubernetes.io/projected/b3434274-8fa9-481e-92d1-dcad4d626b4e-kube-api-access-zm78p\") pod \"b3434274-8fa9-481e-92d1-dcad4d626b4e\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") "
Apr 24 22:34:42.720989 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.720913 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-bundle\") pod \"b3434274-8fa9-481e-92d1-dcad4d626b4e\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") "
Apr 24 22:34:42.720989 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.720942 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-util\") pod \"b3434274-8fa9-481e-92d1-dcad4d626b4e\" (UID: \"b3434274-8fa9-481e-92d1-dcad4d626b4e\") "
Apr 24 22:34:42.721469 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.721441 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-bundle" (OuterVolumeSpecName: "bundle") pod "b3434274-8fa9-481e-92d1-dcad4d626b4e" (UID: "b3434274-8fa9-481e-92d1-dcad4d626b4e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:34:42.723261 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.723231 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3434274-8fa9-481e-92d1-dcad4d626b4e-kube-api-access-zm78p" (OuterVolumeSpecName: "kube-api-access-zm78p") pod "b3434274-8fa9-481e-92d1-dcad4d626b4e" (UID: "b3434274-8fa9-481e-92d1-dcad4d626b4e"). InnerVolumeSpecName "kube-api-access-zm78p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:34:42.725650 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.725585 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-util" (OuterVolumeSpecName: "util") pod "b3434274-8fa9-481e-92d1-dcad4d626b4e" (UID: "b3434274-8fa9-481e-92d1-dcad4d626b4e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:34:42.822279 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.822236 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zm78p\" (UniqueName: \"kubernetes.io/projected/b3434274-8fa9-481e-92d1-dcad4d626b4e-kube-api-access-zm78p\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:34:42.822279 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.822272 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-bundle\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:34:42.822279 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:42.822287 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3434274-8fa9-481e-92d1-dcad4d626b4e-util\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\""
Apr 24 22:34:43.575585 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:43.575555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht" event={"ID":"b3434274-8fa9-481e-92d1-dcad4d626b4e","Type":"ContainerDied","Data":"00fa8922442a41bd00dbe468138fbc36cf2df0b150a56fc5430dba93cf3a11df"}
Apr 24 22:34:43.575585 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:43.575590 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fa8922442a41bd00dbe468138fbc36cf2df0b150a56fc5430dba93cf3a11df"
Apr 24 22:34:43.575803 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:43.575570 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctscht"
Apr 24 22:34:48.561015 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.560977 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"]
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561363 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="pull"
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561379 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="pull"
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561405 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="util"
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561413 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="util"
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561424 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="extract"
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561433 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="extract"
Apr 24 22:34:48.561533 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.561517 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3434274-8fa9-481e-92d1-dcad4d626b4e" containerName="extract"
Apr 24 22:34:48.565149 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.565128 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.567995 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.567965 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 22:34:48.568124 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.568004 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 22:34:48.568381 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.568364 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 22:34:48.568543 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.568482 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gbdqq\""
Apr 24 22:34:48.575882 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.575857 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"]
Apr 24 22:34:48.665684 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.665639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6xb\" (UniqueName: \"kubernetes.io/projected/6677bb4a-2ed2-4608-b42a-3cfc6cb896ee-kube-api-access-cn6xb\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-896m9\" (UID: \"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.665877 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.665762 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6677bb4a-2ed2-4608-b42a-3cfc6cb896ee-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-896m9\" (UID: \"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.766984 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.766938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6677bb4a-2ed2-4608-b42a-3cfc6cb896ee-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-896m9\" (UID: \"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.767158 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.767015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6xb\" (UniqueName: \"kubernetes.io/projected/6677bb4a-2ed2-4608-b42a-3cfc6cb896ee-kube-api-access-cn6xb\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-896m9\" (UID: \"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.769559 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.769538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/6677bb4a-2ed2-4608-b42a-3cfc6cb896ee-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-896m9\" (UID: \"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.776152 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.776129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6xb\" (UniqueName: \"kubernetes.io/projected/6677bb4a-2ed2-4608-b42a-3cfc6cb896ee-kube-api-access-cn6xb\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-896m9\" (UID: \"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:48.878149 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:48.878074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:49.003644 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:49.003483 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"]
Apr 24 22:34:49.007438 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:34:49.007403 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6677bb4a_2ed2_4608_b42a_3cfc6cb896ee.slice/crio-43c27d6b2a8c214466ede7994f87de704d5becd2acb93a1604fb7376e3feca5a WatchSource:0}: Error finding container 43c27d6b2a8c214466ede7994f87de704d5becd2acb93a1604fb7376e3feca5a: Status 404 returned error can't find the container with id 43c27d6b2a8c214466ede7994f87de704d5becd2acb93a1604fb7376e3feca5a
Apr 24 22:34:49.595032 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:49.594993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9" event={"ID":"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee","Type":"ContainerStarted","Data":"43c27d6b2a8c214466ede7994f87de704d5becd2acb93a1604fb7376e3feca5a"}
Apr 24 22:34:52.605620 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.605556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9" event={"ID":"6677bb4a-2ed2-4608-b42a-3cfc6cb896ee","Type":"ContainerStarted","Data":"35a84fdd5746db4ff40da8421f79c4b09ece6fe81bbc614f5e2b48a49589d29d"}
Apr 24 22:34:52.605620 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.605621 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9"
Apr 24 22:34:52.625162 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.625109 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9" podStartSLOduration=1.463977472 podStartE2EDuration="4.625092377s" podCreationTimestamp="2026-04-24 22:34:48 +0000 UTC" firstStartedPulling="2026-04-24 22:34:49.009411349 +0000 UTC m=+281.000733241" lastFinishedPulling="2026-04-24 22:34:52.170526266 +0000 UTC m=+284.161848146" observedRunningTime="2026-04-24 22:34:52.624272606 +0000 UTC m=+284.615594504" watchObservedRunningTime="2026-04-24 22:34:52.625092377 +0000 UTC m=+284.616414277"
Apr 24 22:34:52.813124 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.813093 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qtls2"]
Apr 24 22:34:52.816320 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.816296 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:52.819023 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.818999 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 22:34:52.819171 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.819148 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 22:34:52.819218 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.819157 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-cxqdz\""
Apr 24 22:34:52.824130 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.824106 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qtls2"]
Apr 24 22:34:52.904804 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.904713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-cabundle0\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:52.904804 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.904757 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:52.904804 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:52.904799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwxq\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-kube-api-access-mlwxq\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.006011 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.005973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-cabundle0\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.006164 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.006027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.006164 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.006058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwxq\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-kube-api-access-mlwxq\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.006268 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.006191 2573 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 24 22:34:53.006268 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.006219 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:53.006268 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.006239 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:53.006268 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.006256 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qtls2: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 22:34:53.006383 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.006329 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates podName:94a63cd6-0372-4ed8-ab43-58d88ce1a34c nodeName:}" failed. No retries permitted until 2026-04-24 22:34:53.506305773 +0000 UTC m=+285.497627667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates") pod "keda-operator-ffbb595cb-qtls2" (UID: "94a63cd6-0372-4ed8-ab43-58d88ce1a34c") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 24 22:34:53.006787 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.006770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-cabundle0\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.015870 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.015834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlwxq\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-kube-api-access-mlwxq\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.181365 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.181287 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"]
Apr 24 22:34:53.184803 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.184785 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.188282 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.188262 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 22:34:53.198485 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.198450 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"]
Apr 24 22:34:53.309455 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.309421 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/99d56fec-4c75-40be-ba13-aeb80797f89a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.309621 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.309474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmgx\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-kube-api-access-tcmgx\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.309621 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.309550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.410809 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.410772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/99d56fec-4c75-40be-ba13-aeb80797f89a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.410997 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.410848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmgx\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-kube-api-access-tcmgx\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.410997 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.410905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.411094 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.411002 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:53.411094 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.411023 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:53.411094 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.411045 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn: references non-existent secret key: tls.crt
Apr 24 22:34:53.411094 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.411089 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates podName:99d56fec-4c75-40be-ba13-aeb80797f89a nodeName:}" failed. No retries permitted until 2026-04-24 22:34:53.911074641 +0000 UTC m=+285.902396519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates") pod "keda-metrics-apiserver-7c9f485588-zdwwn" (UID: "99d56fec-4c75-40be-ba13-aeb80797f89a") : references non-existent secret key: tls.crt
Apr 24 22:34:53.411250 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.411101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/99d56fec-4c75-40be-ba13-aeb80797f89a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.421477 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.421440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmgx\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-kube-api-access-tcmgx\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.486989 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.486958 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-cpwjv"]
Apr 24 22:34:53.490175 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.490152 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.493338 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.493316 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 22:34:53.503408 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.503388 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-cpwjv"]
Apr 24 22:34:53.512043 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.512018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:53.512210 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.512191 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:53.512260 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.512217 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:53.512260 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.512230 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qtls2: references non-existent secret key: ca.crt
Apr 24 22:34:53.512333 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.512295 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates podName:94a63cd6-0372-4ed8-ab43-58d88ce1a34c nodeName:}" failed. No retries permitted until 2026-04-24 22:34:54.512274278 +0000 UTC m=+286.503596160 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates") pod "keda-operator-ffbb595cb-qtls2" (UID: "94a63cd6-0372-4ed8-ab43-58d88ce1a34c") : references non-existent secret key: ca.crt
Apr 24 22:34:53.612660 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.612624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55zrn\" (UniqueName: \"kubernetes.io/projected/84c9487b-95cb-41dd-b568-ed98a41e5178-kube-api-access-55zrn\") pod \"keda-admission-cf49989db-cpwjv\" (UID: \"84c9487b-95cb-41dd-b568-ed98a41e5178\") " pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.613074 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.612764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84c9487b-95cb-41dd-b568-ed98a41e5178-certificates\") pod \"keda-admission-cf49989db-cpwjv\" (UID: \"84c9487b-95cb-41dd-b568-ed98a41e5178\") " pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.713218 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.713177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84c9487b-95cb-41dd-b568-ed98a41e5178-certificates\") pod \"keda-admission-cf49989db-cpwjv\" (UID: \"84c9487b-95cb-41dd-b568-ed98a41e5178\") " pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.713418 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.713263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55zrn\" (UniqueName: \"kubernetes.io/projected/84c9487b-95cb-41dd-b568-ed98a41e5178-kube-api-access-55zrn\") pod \"keda-admission-cf49989db-cpwjv\" (UID: \"84c9487b-95cb-41dd-b568-ed98a41e5178\") " pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.718837 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.718801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/84c9487b-95cb-41dd-b568-ed98a41e5178-certificates\") pod \"keda-admission-cf49989db-cpwjv\" (UID: \"84c9487b-95cb-41dd-b568-ed98a41e5178\") " pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.722560 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.722534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55zrn\" (UniqueName: \"kubernetes.io/projected/84c9487b-95cb-41dd-b568-ed98a41e5178-kube-api-access-55zrn\") pod \"keda-admission-cf49989db-cpwjv\" (UID: \"84c9487b-95cb-41dd-b568-ed98a41e5178\") " pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.800769 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.800680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:53.917466 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.917401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:53.917657 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.917642 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:53.917735 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.917662 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:53.917735 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.917685 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn: references non-existent secret key: tls.crt
Apr 24 22:34:53.917843 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:53.917751 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates podName:99d56fec-4c75-40be-ba13-aeb80797f89a nodeName:}" failed. No retries permitted until 2026-04-24 22:34:54.91773174 +0000 UTC m=+286.909053627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates") pod "keda-metrics-apiserver-7c9f485588-zdwwn" (UID: "99d56fec-4c75-40be-ba13-aeb80797f89a") : references non-existent secret key: tls.crt
Apr 24 22:34:53.981410 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:53.981379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-cpwjv"]
Apr 24 22:34:53.984363 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:34:53.984338 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c9487b_95cb_41dd_b568_ed98a41e5178.slice/crio-71cb146eb4b6707d94ef170b87aacfb0adf8332f20aee6bd5dcdcd41e4c8f866 WatchSource:0}: Error finding container 71cb146eb4b6707d94ef170b87aacfb0adf8332f20aee6bd5dcdcd41e4c8f866: Status 404 returned error can't find the container with id 71cb146eb4b6707d94ef170b87aacfb0adf8332f20aee6bd5dcdcd41e4c8f866
Apr 24 22:34:54.524000 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:54.523965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:54.524209 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.524142 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:54.524209 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.524163 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:54.524209 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.524176 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qtls2: references non-existent secret key: ca.crt
Apr 24 22:34:54.524361 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.524239 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates podName:94a63cd6-0372-4ed8-ab43-58d88ce1a34c nodeName:}" failed. No retries permitted until 2026-04-24 22:34:56.524220431 +0000 UTC m=+288.515542315 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates") pod "keda-operator-ffbb595cb-qtls2" (UID: "94a63cd6-0372-4ed8-ab43-58d88ce1a34c") : references non-existent secret key: ca.crt
Apr 24 22:34:54.613065 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:54.613023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-cpwjv" event={"ID":"84c9487b-95cb-41dd-b568-ed98a41e5178","Type":"ContainerStarted","Data":"71cb146eb4b6707d94ef170b87aacfb0adf8332f20aee6bd5dcdcd41e4c8f866"}
Apr 24 22:34:54.927383 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:54.927300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"
Apr 24 22:34:54.927525 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.927431 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:34:54.927525 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.927447 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:34:54.927525 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.927466 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn: references non-existent secret key: tls.crt
Apr 24 22:34:54.927525 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:54.927518 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates podName:99d56fec-4c75-40be-ba13-aeb80797f89a nodeName:}" failed. No retries permitted until 2026-04-24 22:34:56.927504458 +0000 UTC m=+288.918826337 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates") pod "keda-metrics-apiserver-7c9f485588-zdwwn" (UID: "99d56fec-4c75-40be-ba13-aeb80797f89a") : references non-existent secret key: tls.crt
Apr 24 22:34:56.539620 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:56.539557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2"
Apr 24 22:34:56.540002 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.539710 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 24 22:34:56.540002 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.539727 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 22:34:56.540002 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.539736 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-qtls2: references non-existent secret key: ca.crt
Apr 24 22:34:56.540002 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.539786 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates podName:94a63cd6-0372-4ed8-ab43-58d88ce1a34c nodeName:}" failed. No retries permitted until 2026-04-24 22:35:00.539772496 +0000 UTC m=+292.531094375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates") pod "keda-operator-ffbb595cb-qtls2" (UID: "94a63cd6-0372-4ed8-ab43-58d88ce1a34c") : references non-existent secret key: ca.crt
Apr 24 22:34:56.620688 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:56.620623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-cpwjv" event={"ID":"84c9487b-95cb-41dd-b568-ed98a41e5178","Type":"ContainerStarted","Data":"a7cff304c8420dc457f0777fba17fba591cf790f5f9303a980fe9bdc7a9a1752"}
Apr 24 22:34:56.620688 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:56.620697 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-cpwjv"
Apr 24 22:34:56.639787 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:56.639740 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-cpwjv" podStartSLOduration=1.831075019 podStartE2EDuration="3.639724776s" podCreationTimestamp="2026-04-24 22:34:53 +0000 UTC" firstStartedPulling="2026-04-24 22:34:53.985786884 +0000 UTC m=+285.977108765" lastFinishedPulling="2026-04-24 22:34:55.794436644 +0000 UTC m=+287.785758522" observedRunningTime="2026-04-24 22:34:56.638380429 +0000 UTC m=+288.629702356" watchObservedRunningTime="2026-04-24 22:34:56.639724776 +0000 UTC m=+288.631046676"
Apr 24 22:34:56.943672 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:34:56.943558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" Apr 24 22:34:56.943813 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.943720 2573 secret.go:281] references non-existent secret key: tls.crt Apr 24 22:34:56.943813 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.943741 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 22:34:56.943813 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.943759 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn: references non-existent secret key: tls.crt Apr 24 22:34:56.943917 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:34:56.943821 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates podName:99d56fec-4c75-40be-ba13-aeb80797f89a nodeName:}" failed. No retries permitted until 2026-04-24 22:35:00.943805774 +0000 UTC m=+292.935127652 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates") pod "keda-metrics-apiserver-7c9f485588-zdwwn" (UID: "99d56fec-4c75-40be-ba13-aeb80797f89a") : references non-existent secret key: tls.crt Apr 24 22:35:00.577026 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.576981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2" Apr 24 22:35:00.579527 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.579498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/94a63cd6-0372-4ed8-ab43-58d88ce1a34c-certificates\") pod \"keda-operator-ffbb595cb-qtls2\" (UID: \"94a63cd6-0372-4ed8-ab43-58d88ce1a34c\") " pod="openshift-keda/keda-operator-ffbb595cb-qtls2" Apr 24 22:35:00.627561 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.627524 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-qtls2" Apr 24 22:35:00.746076 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.746030 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-qtls2"] Apr 24 22:35:00.748806 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:35:00.748777 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a63cd6_0372_4ed8_ab43_58d88ce1a34c.slice/crio-cdb3f097527741d4413ca4b2019d1bd37635c35f2352e2beb22132c5f0f2b682 WatchSource:0}: Error finding container cdb3f097527741d4413ca4b2019d1bd37635c35f2352e2beb22132c5f0f2b682: Status 404 returned error can't find the container with id cdb3f097527741d4413ca4b2019d1bd37635c35f2352e2beb22132c5f0f2b682 Apr 24 22:35:00.980938 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.980912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" Apr 24 22:35:00.983510 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.983489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99d56fec-4c75-40be-ba13-aeb80797f89a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-zdwwn\" (UID: \"99d56fec-4c75-40be-ba13-aeb80797f89a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" Apr 24 22:35:00.995331 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:00.995300 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" Apr 24 22:35:01.122081 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:01.121986 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn"] Apr 24 22:35:01.124559 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:35:01.124532 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99d56fec_4c75_40be_ba13_aeb80797f89a.slice/crio-dd5bb6127ab3ac87ba3adeea837628a108c3ae0c90e799bb8ac8f094fde26a55 WatchSource:0}: Error finding container dd5bb6127ab3ac87ba3adeea837628a108c3ae0c90e799bb8ac8f094fde26a55: Status 404 returned error can't find the container with id dd5bb6127ab3ac87ba3adeea837628a108c3ae0c90e799bb8ac8f094fde26a55 Apr 24 22:35:01.635559 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:01.635514 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" event={"ID":"99d56fec-4c75-40be-ba13-aeb80797f89a","Type":"ContainerStarted","Data":"dd5bb6127ab3ac87ba3adeea837628a108c3ae0c90e799bb8ac8f094fde26a55"} Apr 24 22:35:01.636870 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:01.636832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qtls2" event={"ID":"94a63cd6-0372-4ed8-ab43-58d88ce1a34c","Type":"ContainerStarted","Data":"cdb3f097527741d4413ca4b2019d1bd37635c35f2352e2beb22132c5f0f2b682"} Apr 24 22:35:04.649250 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:04.649220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" event={"ID":"99d56fec-4c75-40be-ba13-aeb80797f89a","Type":"ContainerStarted","Data":"c7793f34a35d9ccce802bc6e58246c4d2bb2eb752264133953080c6005ef96a7"} Apr 24 22:35:04.649721 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:04.649344 2573 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" Apr 24 22:35:04.650651 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:04.650629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-qtls2" event={"ID":"94a63cd6-0372-4ed8-ab43-58d88ce1a34c","Type":"ContainerStarted","Data":"68b79e1c6dc8bf3e3a50d7638ec862149712b965929290f3f1c1c88cfc261255"} Apr 24 22:35:04.650782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:04.650767 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-qtls2" Apr 24 22:35:04.667261 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:04.667187 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" podStartSLOduration=8.284509878 podStartE2EDuration="11.667171989s" podCreationTimestamp="2026-04-24 22:34:53 +0000 UTC" firstStartedPulling="2026-04-24 22:35:01.125896101 +0000 UTC m=+293.117217983" lastFinishedPulling="2026-04-24 22:35:04.508558216 +0000 UTC m=+296.499880094" observedRunningTime="2026-04-24 22:35:04.666099023 +0000 UTC m=+296.657420923" watchObservedRunningTime="2026-04-24 22:35:04.667171989 +0000 UTC m=+296.658493939" Apr 24 22:35:04.682375 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:04.682321 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-qtls2" podStartSLOduration=8.924290962 podStartE2EDuration="12.682304725s" podCreationTimestamp="2026-04-24 22:34:52 +0000 UTC" firstStartedPulling="2026-04-24 22:35:00.750128871 +0000 UTC m=+292.741450749" lastFinishedPulling="2026-04-24 22:35:04.508142632 +0000 UTC m=+296.499464512" observedRunningTime="2026-04-24 22:35:04.681249546 +0000 UTC m=+296.672571447" watchObservedRunningTime="2026-04-24 22:35:04.682304725 +0000 UTC m=+296.673626670" Apr 24 22:35:08.467169 
ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:08.467145 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:35:13.611118 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:13.611083 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-896m9" Apr 24 22:35:15.657888 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:15.657852 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-zdwwn" Apr 24 22:35:17.625944 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:17.625907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-cpwjv" Apr 24 22:35:25.656158 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:35:25.656125 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-qtls2" Apr 24 22:36:01.268199 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.268164 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-dxsvt"] Apr 24 22:36:01.271369 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.271354 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.278395 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.278375 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 22:36:01.279036 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.279020 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 22:36:01.279162 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.279044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 22:36:01.279162 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.279130 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-s5mdp\"" Apr 24 22:36:01.283249 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.283230 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk"] Apr 24 22:36:01.286402 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.286379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-dxsvt"] Apr 24 22:36:01.286501 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.286403 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.289961 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.289942 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 22:36:01.290118 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.290010 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-dqmt4\"" Apr 24 22:36:01.311575 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.311542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk"] Apr 24 22:36:01.396022 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.395986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4526e70-49e7-4c60-ac11-447a12b62f19-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zqjk\" (UID: \"d4526e70-49e7-4c60-ac11-447a12b62f19\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.396022 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.396026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ce9a196-51f6-44cc-b942-cd9c90dde094-cert\") pod \"kserve-controller-manager-549bc44c6d-dxsvt\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.396274 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.396045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk6th\" (UniqueName: \"kubernetes.io/projected/9ce9a196-51f6-44cc-b942-cd9c90dde094-kube-api-access-mk6th\") pod \"kserve-controller-manager-549bc44c6d-dxsvt\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " 
pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.396274 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.396146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfzn\" (UniqueName: \"kubernetes.io/projected/d4526e70-49e7-4c60-ac11-447a12b62f19-kube-api-access-4rfzn\") pod \"llmisvc-controller-manager-68cc5db7c4-2zqjk\" (UID: \"d4526e70-49e7-4c60-ac11-447a12b62f19\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.497611 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.497570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfzn\" (UniqueName: \"kubernetes.io/projected/d4526e70-49e7-4c60-ac11-447a12b62f19-kube-api-access-4rfzn\") pod \"llmisvc-controller-manager-68cc5db7c4-2zqjk\" (UID: \"d4526e70-49e7-4c60-ac11-447a12b62f19\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.497803 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.497690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4526e70-49e7-4c60-ac11-447a12b62f19-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zqjk\" (UID: \"d4526e70-49e7-4c60-ac11-447a12b62f19\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.497803 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.497718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ce9a196-51f6-44cc-b942-cd9c90dde094-cert\") pod \"kserve-controller-manager-549bc44c6d-dxsvt\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.497803 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.497734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk6th\" (UniqueName: 
\"kubernetes.io/projected/9ce9a196-51f6-44cc-b942-cd9c90dde094-kube-api-access-mk6th\") pod \"kserve-controller-manager-549bc44c6d-dxsvt\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.500315 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.500289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ce9a196-51f6-44cc-b942-cd9c90dde094-cert\") pod \"kserve-controller-manager-549bc44c6d-dxsvt\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.500415 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.500355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d4526e70-49e7-4c60-ac11-447a12b62f19-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2zqjk\" (UID: \"d4526e70-49e7-4c60-ac11-447a12b62f19\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.507093 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.507063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfzn\" (UniqueName: \"kubernetes.io/projected/d4526e70-49e7-4c60-ac11-447a12b62f19-kube-api-access-4rfzn\") pod \"llmisvc-controller-manager-68cc5db7c4-2zqjk\" (UID: \"d4526e70-49e7-4c60-ac11-447a12b62f19\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.507316 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.507294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk6th\" (UniqueName: \"kubernetes.io/projected/9ce9a196-51f6-44cc-b942-cd9c90dde094-kube-api-access-mk6th\") pod \"kserve-controller-manager-549bc44c6d-dxsvt\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.581322 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:36:01.581234 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:01.596139 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.596111 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:01.720123 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.720097 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-dxsvt"] Apr 24 22:36:01.722500 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:36:01.722473 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce9a196_51f6_44cc_b942_cd9c90dde094.slice/crio-62e22e3f621a9f781ffdb0d61e78babfe13439af7af4867f42774748117b4dbc WatchSource:0}: Error finding container 62e22e3f621a9f781ffdb0d61e78babfe13439af7af4867f42774748117b4dbc: Status 404 returned error can't find the container with id 62e22e3f621a9f781ffdb0d61e78babfe13439af7af4867f42774748117b4dbc Apr 24 22:36:01.723797 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.723769 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:36:01.750451 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.750425 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk"] Apr 24 22:36:01.753738 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:36:01.753698 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd4526e70_49e7_4c60_ac11_447a12b62f19.slice/crio-c371c968003c09d95ad2206726be12a81348d3876aa7393f8847ee38be2471d2 WatchSource:0}: Error finding container c371c968003c09d95ad2206726be12a81348d3876aa7393f8847ee38be2471d2: Status 404 returned error can't find the container with id 
c371c968003c09d95ad2206726be12a81348d3876aa7393f8847ee38be2471d2 Apr 24 22:36:01.814543 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.814510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" event={"ID":"d4526e70-49e7-4c60-ac11-447a12b62f19","Type":"ContainerStarted","Data":"c371c968003c09d95ad2206726be12a81348d3876aa7393f8847ee38be2471d2"} Apr 24 22:36:01.815573 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:01.815551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" event={"ID":"9ce9a196-51f6-44cc-b942-cd9c90dde094","Type":"ContainerStarted","Data":"62e22e3f621a9f781ffdb0d61e78babfe13439af7af4867f42774748117b4dbc"} Apr 24 22:36:05.829843 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:05.829809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" event={"ID":"d4526e70-49e7-4c60-ac11-447a12b62f19","Type":"ContainerStarted","Data":"600266350c89df97c0561638cd82ba2629e5326edf9d0a354a9c6ade4cbe7b4a"} Apr 24 22:36:05.830295 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:05.829874 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:05.831143 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:05.831121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" event={"ID":"9ce9a196-51f6-44cc-b942-cd9c90dde094","Type":"ContainerStarted","Data":"0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171"} Apr 24 22:36:05.831273 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:05.831245 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:05.846886 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:05.846837 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" podStartSLOduration=1.555895453 podStartE2EDuration="4.846823859s" podCreationTimestamp="2026-04-24 22:36:01 +0000 UTC" firstStartedPulling="2026-04-24 22:36:01.755066564 +0000 UTC m=+353.746388442" lastFinishedPulling="2026-04-24 22:36:05.045994956 +0000 UTC m=+357.037316848" observedRunningTime="2026-04-24 22:36:05.846241034 +0000 UTC m=+357.837562935" watchObservedRunningTime="2026-04-24 22:36:05.846823859 +0000 UTC m=+357.838145759" Apr 24 22:36:05.868462 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:05.868403 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" podStartSLOduration=1.584772592 podStartE2EDuration="4.868387683s" podCreationTimestamp="2026-04-24 22:36:01 +0000 UTC" firstStartedPulling="2026-04-24 22:36:01.723892902 +0000 UTC m=+353.715214779" lastFinishedPulling="2026-04-24 22:36:05.007507978 +0000 UTC m=+356.998829870" observedRunningTime="2026-04-24 22:36:05.867803745 +0000 UTC m=+357.859125649" watchObservedRunningTime="2026-04-24 22:36:05.868387683 +0000 UTC m=+357.859709610" Apr 24 22:36:36.836302 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:36.836272 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2zqjk" Apr 24 22:36:36.839411 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:36.839396 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:38.001788 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.001741 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-dxsvt"] Apr 24 22:36:38.002273 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.002099 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" podUID="9ce9a196-51f6-44cc-b942-cd9c90dde094" containerName="manager" containerID="cri-o://0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171" gracePeriod=10 Apr 24 22:36:38.027831 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.027803 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-l7h8w"] Apr 24 22:36:38.031099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.031081 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.040019 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.039991 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-l7h8w"] Apr 24 22:36:38.119170 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.119140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13de8ff4-3f03-4cf2-9d1c-ea9f64ace207-cert\") pod \"kserve-controller-manager-549bc44c6d-l7h8w\" (UID: \"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207\") " pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.119321 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.119210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hr7w\" (UniqueName: \"kubernetes.io/projected/13de8ff4-3f03-4cf2-9d1c-ea9f64ace207-kube-api-access-9hr7w\") pod \"kserve-controller-manager-549bc44c6d-l7h8w\" (UID: \"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207\") " pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.220134 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.220094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13de8ff4-3f03-4cf2-9d1c-ea9f64ace207-cert\") pod 
\"kserve-controller-manager-549bc44c6d-l7h8w\" (UID: \"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207\") " pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.220311 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.220183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hr7w\" (UniqueName: \"kubernetes.io/projected/13de8ff4-3f03-4cf2-9d1c-ea9f64ace207-kube-api-access-9hr7w\") pod \"kserve-controller-manager-549bc44c6d-l7h8w\" (UID: \"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207\") " pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.222630 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.222583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13de8ff4-3f03-4cf2-9d1c-ea9f64ace207-cert\") pod \"kserve-controller-manager-549bc44c6d-l7h8w\" (UID: \"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207\") " pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.229446 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.229410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hr7w\" (UniqueName: \"kubernetes.io/projected/13de8ff4-3f03-4cf2-9d1c-ea9f64ace207-kube-api-access-9hr7w\") pod \"kserve-controller-manager-549bc44c6d-l7h8w\" (UID: \"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207\") " pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.242490 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.242467 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:38.321638 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.321523 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk6th\" (UniqueName: \"kubernetes.io/projected/9ce9a196-51f6-44cc-b942-cd9c90dde094-kube-api-access-mk6th\") pod \"9ce9a196-51f6-44cc-b942-cd9c90dde094\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " Apr 24 22:36:38.321638 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.321560 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ce9a196-51f6-44cc-b942-cd9c90dde094-cert\") pod \"9ce9a196-51f6-44cc-b942-cd9c90dde094\" (UID: \"9ce9a196-51f6-44cc-b942-cd9c90dde094\") " Apr 24 22:36:38.323784 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.323757 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce9a196-51f6-44cc-b942-cd9c90dde094-cert" (OuterVolumeSpecName: "cert") pod "9ce9a196-51f6-44cc-b942-cd9c90dde094" (UID: "9ce9a196-51f6-44cc-b942-cd9c90dde094"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:36:38.323880 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.323850 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce9a196-51f6-44cc-b942-cd9c90dde094-kube-api-access-mk6th" (OuterVolumeSpecName: "kube-api-access-mk6th") pod "9ce9a196-51f6-44cc-b942-cd9c90dde094" (UID: "9ce9a196-51f6-44cc-b942-cd9c90dde094"). InnerVolumeSpecName "kube-api-access-mk6th". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:36:38.384615 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.384567 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.422499 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.422460 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mk6th\" (UniqueName: \"kubernetes.io/projected/9ce9a196-51f6-44cc-b942-cd9c90dde094-kube-api-access-mk6th\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:36:38.422499 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.422494 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ce9a196-51f6-44cc-b942-cd9c90dde094-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:36:38.510967 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.510940 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-l7h8w"] Apr 24 22:36:38.512952 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:36:38.512921 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13de8ff4_3f03_4cf2_9d1c_ea9f64ace207.slice/crio-a7dc17e736f9ede34a117c592f9a45c4d041f26034eacf48adda8f6338814cca WatchSource:0}: Error finding container a7dc17e736f9ede34a117c592f9a45c4d041f26034eacf48adda8f6338814cca: Status 404 returned error can't find the container with id a7dc17e736f9ede34a117c592f9a45c4d041f26034eacf48adda8f6338814cca Apr 24 22:36:38.937725 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.937695 2573 generic.go:358] "Generic (PLEG): container finished" podID="9ce9a196-51f6-44cc-b942-cd9c90dde094" containerID="0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171" exitCode=0 Apr 24 22:36:38.937913 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.937774 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" Apr 24 22:36:38.937913 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.937791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" event={"ID":"9ce9a196-51f6-44cc-b942-cd9c90dde094","Type":"ContainerDied","Data":"0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171"} Apr 24 22:36:38.937913 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.937826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-dxsvt" event={"ID":"9ce9a196-51f6-44cc-b942-cd9c90dde094","Type":"ContainerDied","Data":"62e22e3f621a9f781ffdb0d61e78babfe13439af7af4867f42774748117b4dbc"} Apr 24 22:36:38.937913 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.937844 2573 scope.go:117] "RemoveContainer" containerID="0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171" Apr 24 22:36:38.939512 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.939491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" event={"ID":"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207","Type":"ContainerStarted","Data":"54a901286f70c6759dcc8c91baf34b191cefcb33dc9013ef23b53585d24607bf"} Apr 24 22:36:38.939664 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.939522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" event={"ID":"13de8ff4-3f03-4cf2-9d1c-ea9f64ace207","Type":"ContainerStarted","Data":"a7dc17e736f9ede34a117c592f9a45c4d041f26034eacf48adda8f6338814cca"} Apr 24 22:36:38.939728 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.939683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:36:38.946099 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.946076 2573 scope.go:117] "RemoveContainer" 
containerID="0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171" Apr 24 22:36:38.946372 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:36:38.946353 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171\": container with ID starting with 0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171 not found: ID does not exist" containerID="0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171" Apr 24 22:36:38.946427 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.946379 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171"} err="failed to get container status \"0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171\": rpc error: code = NotFound desc = could not find container \"0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171\": container with ID starting with 0db267f4d677df482dd63ab551570df8b15488622d3d508c06440731db1ad171 not found: ID does not exist" Apr 24 22:36:38.961774 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.961726 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" podStartSLOduration=0.607894554 podStartE2EDuration="961.713273ms" podCreationTimestamp="2026-04-24 22:36:38 +0000 UTC" firstStartedPulling="2026-04-24 22:36:38.514168171 +0000 UTC m=+390.505490050" lastFinishedPulling="2026-04-24 22:36:38.867986883 +0000 UTC m=+390.859308769" observedRunningTime="2026-04-24 22:36:38.961024655 +0000 UTC m=+390.952346568" watchObservedRunningTime="2026-04-24 22:36:38.961713273 +0000 UTC m=+390.953035172" Apr 24 22:36:38.974021 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.973987 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve/kserve-controller-manager-549bc44c6d-dxsvt"] Apr 24 22:36:38.975907 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:38.975883 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-549bc44c6d-dxsvt"] Apr 24 22:36:40.567413 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:36:40.567377 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce9a196-51f6-44cc-b942-cd9c90dde094" path="/var/lib/kubelet/pods/9ce9a196-51f6-44cc-b942-cd9c90dde094/volumes" Apr 24 22:37:09.948698 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:09.948670 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-549bc44c6d-l7h8w" Apr 24 22:37:10.769118 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.769077 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-4dt9g"] Apr 24 22:37:10.769480 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.769454 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ce9a196-51f6-44cc-b942-cd9c90dde094" containerName="manager" Apr 24 22:37:10.769480 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.769474 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce9a196-51f6-44cc-b942-cd9c90dde094" containerName="manager" Apr 24 22:37:10.769681 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.769559 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ce9a196-51f6-44cc-b942-cd9c90dde094" containerName="manager" Apr 24 22:37:10.773995 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.773978 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:10.776385 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.776359 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 22:37:10.776385 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.776381 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-4n2r2\"" Apr 24 22:37:10.777324 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.777307 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-4ctx7"] Apr 24 22:37:10.781294 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.781270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:10.781294 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.781284 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-4dt9g"] Apr 24 22:37:10.783562 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.783543 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 22:37:10.783712 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.783646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-x4r6q\"" Apr 24 22:37:10.792879 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.792854 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-4ctx7"] Apr 24 22:37:10.889656 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.889623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e263d9-56d4-4186-a8a3-aed4cbc1c175-tls-certs\") pod \"model-serving-api-86f7b4b499-4dt9g\" 
(UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:10.889861 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.889687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsp9j\" (UniqueName: \"kubernetes.io/projected/0ef77a69-67af-4e1f-b012-5390274c2187-kube-api-access-tsp9j\") pod \"odh-model-controller-696fc77849-4ctx7\" (UID: \"0ef77a69-67af-4e1f-b012-5390274c2187\") " pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:10.889861 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.889754 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ttr\" (UniqueName: \"kubernetes.io/projected/06e263d9-56d4-4186-a8a3-aed4cbc1c175-kube-api-access-k4ttr\") pod \"model-serving-api-86f7b4b499-4dt9g\" (UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:10.889861 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.889804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef77a69-67af-4e1f-b012-5390274c2187-cert\") pod \"odh-model-controller-696fc77849-4ctx7\" (UID: \"0ef77a69-67af-4e1f-b012-5390274c2187\") " pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:10.990932 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.990892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4ttr\" (UniqueName: \"kubernetes.io/projected/06e263d9-56d4-4186-a8a3-aed4cbc1c175-kube-api-access-k4ttr\") pod \"model-serving-api-86f7b4b499-4dt9g\" (UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:10.990932 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.990932 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef77a69-67af-4e1f-b012-5390274c2187-cert\") pod \"odh-model-controller-696fc77849-4ctx7\" (UID: \"0ef77a69-67af-4e1f-b012-5390274c2187\") " pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:10.991355 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.990975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e263d9-56d4-4186-a8a3-aed4cbc1c175-tls-certs\") pod \"model-serving-api-86f7b4b499-4dt9g\" (UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:10.991355 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.991014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsp9j\" (UniqueName: \"kubernetes.io/projected/0ef77a69-67af-4e1f-b012-5390274c2187-kube-api-access-tsp9j\") pod \"odh-model-controller-696fc77849-4ctx7\" (UID: \"0ef77a69-67af-4e1f-b012-5390274c2187\") " pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:10.991355 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:37:10.991117 2573 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 22:37:10.991355 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:37:10.991201 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06e263d9-56d4-4186-a8a3-aed4cbc1c175-tls-certs podName:06e263d9-56d4-4186-a8a3-aed4cbc1c175 nodeName:}" failed. No retries permitted until 2026-04-24 22:37:11.491179141 +0000 UTC m=+423.482501019 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/06e263d9-56d4-4186-a8a3-aed4cbc1c175-tls-certs") pod "model-serving-api-86f7b4b499-4dt9g" (UID: "06e263d9-56d4-4186-a8a3-aed4cbc1c175") : secret "model-serving-api-tls" not found Apr 24 22:37:10.993534 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:10.993519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef77a69-67af-4e1f-b012-5390274c2187-cert\") pod \"odh-model-controller-696fc77849-4ctx7\" (UID: \"0ef77a69-67af-4e1f-b012-5390274c2187\") " pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:11.005112 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.005080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4ttr\" (UniqueName: \"kubernetes.io/projected/06e263d9-56d4-4186-a8a3-aed4cbc1c175-kube-api-access-k4ttr\") pod \"model-serving-api-86f7b4b499-4dt9g\" (UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:11.005455 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.005436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsp9j\" (UniqueName: \"kubernetes.io/projected/0ef77a69-67af-4e1f-b012-5390274c2187-kube-api-access-tsp9j\") pod \"odh-model-controller-696fc77849-4ctx7\" (UID: \"0ef77a69-67af-4e1f-b012-5390274c2187\") " pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:11.093823 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.093729 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:11.226352 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.226328 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-4ctx7"] Apr 24 22:37:11.228546 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:37:11.228517 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef77a69_67af_4e1f_b012_5390274c2187.slice/crio-3930390783d49ddd8ca4629ad8fe5919375c40481a5f1f1152757c10669a8c47 WatchSource:0}: Error finding container 3930390783d49ddd8ca4629ad8fe5919375c40481a5f1f1152757c10669a8c47: Status 404 returned error can't find the container with id 3930390783d49ddd8ca4629ad8fe5919375c40481a5f1f1152757c10669a8c47 Apr 24 22:37:11.496106 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.496074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e263d9-56d4-4186-a8a3-aed4cbc1c175-tls-certs\") pod \"model-serving-api-86f7b4b499-4dt9g\" (UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:11.498639 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.498618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/06e263d9-56d4-4186-a8a3-aed4cbc1c175-tls-certs\") pod \"model-serving-api-86f7b4b499-4dt9g\" (UID: \"06e263d9-56d4-4186-a8a3-aed4cbc1c175\") " pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:11.685607 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.685566 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:11.833906 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:11.833849 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-4dt9g"] Apr 24 22:37:11.836943 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:37:11.836868 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e263d9_56d4_4186_a8a3_aed4cbc1c175.slice/crio-32fe3c4cb2ef869e24e424902e9093b0bc13ad5304c24665551da2ef9d8abb10 WatchSource:0}: Error finding container 32fe3c4cb2ef869e24e424902e9093b0bc13ad5304c24665551da2ef9d8abb10: Status 404 returned error can't find the container with id 32fe3c4cb2ef869e24e424902e9093b0bc13ad5304c24665551da2ef9d8abb10 Apr 24 22:37:12.045864 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:12.045770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-4ctx7" event={"ID":"0ef77a69-67af-4e1f-b012-5390274c2187","Type":"ContainerStarted","Data":"3930390783d49ddd8ca4629ad8fe5919375c40481a5f1f1152757c10669a8c47"} Apr 24 22:37:12.047023 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:12.046992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-4dt9g" event={"ID":"06e263d9-56d4-4186-a8a3-aed4cbc1c175","Type":"ContainerStarted","Data":"32fe3c4cb2ef869e24e424902e9093b0bc13ad5304c24665551da2ef9d8abb10"} Apr 24 22:37:15.059645 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:15.059575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-4ctx7" event={"ID":"0ef77a69-67af-4e1f-b012-5390274c2187","Type":"ContainerStarted","Data":"3dc6fcca69e35bcf9f2d6e21826e07825394d03664149e82727123ece30fbbce"} Apr 24 22:37:15.060088 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:15.059709 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:15.061075 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:15.061047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-4dt9g" event={"ID":"06e263d9-56d4-4186-a8a3-aed4cbc1c175","Type":"ContainerStarted","Data":"501e77b3e2f0730d0619f8c4fd5ecb99321f3e70944dd8449aa695abb8c523e1"} Apr 24 22:37:15.061207 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:15.061191 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:15.082371 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:15.082319 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-4ctx7" podStartSLOduration=2.197665758 podStartE2EDuration="5.082305438s" podCreationTimestamp="2026-04-24 22:37:10 +0000 UTC" firstStartedPulling="2026-04-24 22:37:11.229883212 +0000 UTC m=+423.221205090" lastFinishedPulling="2026-04-24 22:37:14.114522887 +0000 UTC m=+426.105844770" observedRunningTime="2026-04-24 22:37:15.080878995 +0000 UTC m=+427.072200912" watchObservedRunningTime="2026-04-24 22:37:15.082305438 +0000 UTC m=+427.073627338" Apr 24 22:37:15.101906 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:15.101862 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-4dt9g" podStartSLOduration=2.829011642 podStartE2EDuration="5.101849151s" podCreationTimestamp="2026-04-24 22:37:10 +0000 UTC" firstStartedPulling="2026-04-24 22:37:11.83928794 +0000 UTC m=+423.830609818" lastFinishedPulling="2026-04-24 22:37:14.112125449 +0000 UTC m=+426.103447327" observedRunningTime="2026-04-24 22:37:15.099876453 +0000 UTC m=+427.091198350" watchObservedRunningTime="2026-04-24 22:37:15.101849151 +0000 UTC m=+427.093171050" Apr 24 22:37:26.067184 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:26.067100 2573 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-4ctx7" Apr 24 22:37:26.069033 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:26.069015 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-4dt9g" Apr 24 22:37:27.624078 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.624045 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84bd6c94cf-5trg4"] Apr 24 22:37:27.627371 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.627353 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.639500 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.639472 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bd6c94cf-5trg4"] Apr 24 22:37:27.733996 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.733953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-oauth-config\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.733996 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.734002 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-oauth-serving-cert\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.734266 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.734045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-service-ca\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.734266 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.734075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-config\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.734266 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.734105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-serving-cert\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.734266 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.734168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-trusted-ca-bundle\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.734266 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.734206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qrjd\" (UniqueName: \"kubernetes.io/projected/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-kube-api-access-9qrjd\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835591 ip-10-0-142-173 
kubenswrapper[2573]: I0424 22:37:27.835544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-service-ca\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835591 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.835627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-config\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835821 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.835656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-serving-cert\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835821 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.835695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-trusted-ca-bundle\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835821 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.835720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qrjd\" (UniqueName: \"kubernetes.io/projected/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-kube-api-access-9qrjd\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " 
pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835821 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.835770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-oauth-config\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.835821 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.835799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-oauth-serving-cert\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.836411 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.836382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-service-ca\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.836411 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.836434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-config\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.836411 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.836443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-oauth-serving-cert\") pod \"console-84bd6c94cf-5trg4\" (UID: 
\"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.836822 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.836527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-trusted-ca-bundle\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.838303 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.838275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-serving-cert\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.838396 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.838373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-console-oauth-config\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.844182 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.844166 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qrjd\" (UniqueName: \"kubernetes.io/projected/42ad3bc8-fb68-4b2c-9463-78d5df8c58e8-kube-api-access-9qrjd\") pod \"console-84bd6c94cf-5trg4\" (UID: \"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8\") " pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:27.936510 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:27.936420 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:28.064391 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:28.064359 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bd6c94cf-5trg4"] Apr 24 22:37:28.066865 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:37:28.066837 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ad3bc8_fb68_4b2c_9463_78d5df8c58e8.slice/crio-83e5cb0e7ff8495bd782a6846020a78322cc0b9b561fb9e849d3361d521088f9 WatchSource:0}: Error finding container 83e5cb0e7ff8495bd782a6846020a78322cc0b9b561fb9e849d3361d521088f9: Status 404 returned error can't find the container with id 83e5cb0e7ff8495bd782a6846020a78322cc0b9b561fb9e849d3361d521088f9 Apr 24 22:37:28.103468 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:28.103440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bd6c94cf-5trg4" event={"ID":"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8","Type":"ContainerStarted","Data":"83e5cb0e7ff8495bd782a6846020a78322cc0b9b561fb9e849d3361d521088f9"} Apr 24 22:37:29.111057 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:29.111016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bd6c94cf-5trg4" event={"ID":"42ad3bc8-fb68-4b2c-9463-78d5df8c58e8","Type":"ContainerStarted","Data":"0f2e07ced9bc4233d536a07cc10cea2555de1dbd7fc304fa12d1585df38c0b4a"} Apr 24 22:37:29.129734 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:29.129679 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84bd6c94cf-5trg4" podStartSLOduration=2.1296632620000002 podStartE2EDuration="2.129663262s" podCreationTimestamp="2026-04-24 22:37:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:37:29.128931006 +0000 UTC 
m=+441.120252906" watchObservedRunningTime="2026-04-24 22:37:29.129663262 +0000 UTC m=+441.120985159" Apr 24 22:37:37.936992 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:37.936948 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:37.936992 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:37.937002 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:37.941633 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:37.941587 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:38.140132 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.140097 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw"] Apr 24 22:37:38.143671 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.143651 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.143799 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.143719 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84bd6c94cf-5trg4" Apr 24 22:37:38.146297 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.146277 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dqlzb\"" Apr 24 22:37:38.146509 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.146494 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 22:37:38.151774 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.151751 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw"] Apr 24 22:37:38.193855 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.193781 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b4f86655-v7j2c"] Apr 24 22:37:38.331706 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.331668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj85f\" (UniqueName: \"kubernetes.io/projected/5e29ec13-8916-46b6-8de0-0183a1aacf8b-kube-api-access-bj85f\") pod \"seaweedfs-tls-custom-ddd4dbfd-4xcbw\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.331883 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.331718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5e29ec13-8916-46b6-8de0-0183a1aacf8b-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-4xcbw\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.432854 ip-10-0-142-173 kubenswrapper[2573]: I0424 
22:37:38.432813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5e29ec13-8916-46b6-8de0-0183a1aacf8b-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-4xcbw\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.433057 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.432911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj85f\" (UniqueName: \"kubernetes.io/projected/5e29ec13-8916-46b6-8de0-0183a1aacf8b-kube-api-access-bj85f\") pod \"seaweedfs-tls-custom-ddd4dbfd-4xcbw\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.433267 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.433244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5e29ec13-8916-46b6-8de0-0183a1aacf8b-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-4xcbw\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.441764 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.441736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj85f\" (UniqueName: \"kubernetes.io/projected/5e29ec13-8916-46b6-8de0-0183a1aacf8b-kube-api-access-bj85f\") pod \"seaweedfs-tls-custom-ddd4dbfd-4xcbw\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.453758 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.453683 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:37:38.575742 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:38.575720 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw"] Apr 24 22:37:38.578017 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:37:38.577995 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e29ec13_8916_46b6_8de0_0183a1aacf8b.slice/crio-4ec0ec3e48cd2b9a20718e52a7c1718dc5be9cef6a1ebefdf294d310b91fb564 WatchSource:0}: Error finding container 4ec0ec3e48cd2b9a20718e52a7c1718dc5be9cef6a1ebefdf294d310b91fb564: Status 404 returned error can't find the container with id 4ec0ec3e48cd2b9a20718e52a7c1718dc5be9cef6a1ebefdf294d310b91fb564 Apr 24 22:37:39.144782 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:39.144743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" event={"ID":"5e29ec13-8916-46b6-8de0-0183a1aacf8b","Type":"ContainerStarted","Data":"4ec0ec3e48cd2b9a20718e52a7c1718dc5be9cef6a1ebefdf294d310b91fb564"} Apr 24 22:37:42.157945 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:42.157911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" event={"ID":"5e29ec13-8916-46b6-8de0-0183a1aacf8b","Type":"ContainerStarted","Data":"fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2"} Apr 24 22:37:42.177452 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:42.177396 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" podStartSLOduration=1.6253656730000001 podStartE2EDuration="4.177381593s" podCreationTimestamp="2026-04-24 22:37:38 +0000 UTC" firstStartedPulling="2026-04-24 22:37:38.579320365 +0000 UTC m=+450.570642258" lastFinishedPulling="2026-04-24 22:37:41.131336286 +0000 UTC m=+453.122658178" 
observedRunningTime="2026-04-24 22:37:42.176442776 +0000 UTC m=+454.167764676" watchObservedRunningTime="2026-04-24 22:37:42.177381593 +0000 UTC m=+454.168703492" Apr 24 22:37:43.929723 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:43.929687 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw"] Apr 24 22:37:44.164582 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:37:44.164537 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" podUID="5e29ec13-8916-46b6-8de0-0183a1aacf8b" containerName="seaweedfs-tls-custom" containerID="cri-o://fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2" gracePeriod=30 Apr 24 22:38:03.215261 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.215181 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69b4f86655-v7j2c" podUID="6e031001-588c-46cf-8201-3b36d4d7f4a9" containerName="console" containerID="cri-o://6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183" gracePeriod=15 Apr 24 22:38:03.454670 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.454648 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b4f86655-v7j2c_6e031001-588c-46cf-8201-3b36d4d7f4a9/console/0.log" Apr 24 22:38:03.454804 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.454716 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b4f86655-v7j2c" Apr 24 22:38:03.543827 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.543788 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-config\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.543882 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgxz7\" (UniqueName: \"kubernetes.io/projected/6e031001-588c-46cf-8201-3b36d4d7f4a9-kube-api-access-cgxz7\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.543923 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-oauth-serving-cert\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.543951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-trusted-ca-bundle\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544016 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.543977 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-serving-cert\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544016 
ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.544004 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-service-ca\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544259 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.544046 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-oauth-config\") pod \"6e031001-588c-46cf-8201-3b36d4d7f4a9\" (UID: \"6e031001-588c-46cf-8201-3b36d4d7f4a9\") " Apr 24 22:38:03.544340 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.544318 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-config" (OuterVolumeSpecName: "console-config") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:03.544414 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.544384 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:03.544466 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.544427 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-service-ca" (OuterVolumeSpecName: "service-ca") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:03.544466 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.544450 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:38:03.546466 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.546438 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e031001-588c-46cf-8201-3b36d4d7f4a9-kube-api-access-cgxz7" (OuterVolumeSpecName: "kube-api-access-cgxz7") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "kube-api-access-cgxz7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:38:03.546577 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.546442 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:38:03.546577 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.546505 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6e031001-588c-46cf-8201-3b36d4d7f4a9" (UID: "6e031001-588c-46cf-8201-3b36d4d7f4a9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:38:03.645511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645473 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cgxz7\" (UniqueName: \"kubernetes.io/projected/6e031001-588c-46cf-8201-3b36d4d7f4a9-kube-api-access-cgxz7\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:03.645511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645506 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-oauth-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:03.645511 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645516 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-trusted-ca-bundle\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:03.645768 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645524 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-serving-cert\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:03.645768 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645534 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-service-ca\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:03.645768 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645544 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-oauth-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:03.645768 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:03.645553 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e031001-588c-46cf-8201-3b36d4d7f4a9-console-config\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:04.230282 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.230256 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b4f86655-v7j2c_6e031001-588c-46cf-8201-3b36d4d7f4a9/console/0.log" Apr 24 22:38:04.230708 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.230293 2573 generic.go:358] "Generic (PLEG): container finished" podID="6e031001-588c-46cf-8201-3b36d4d7f4a9" containerID="6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183" exitCode=2 Apr 24 22:38:04.230708 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.230363 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b4f86655-v7j2c" Apr 24 22:38:04.230708 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.230380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b4f86655-v7j2c" event={"ID":"6e031001-588c-46cf-8201-3b36d4d7f4a9","Type":"ContainerDied","Data":"6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183"} Apr 24 22:38:04.230708 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.230415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b4f86655-v7j2c" event={"ID":"6e031001-588c-46cf-8201-3b36d4d7f4a9","Type":"ContainerDied","Data":"9f1df0613ba3540ef548118cb56da2f0f8632bdc42337c74109b7136be66613b"} Apr 24 22:38:04.230708 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.230431 2573 scope.go:117] "RemoveContainer" containerID="6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183" Apr 24 22:38:04.238999 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.238979 2573 scope.go:117] "RemoveContainer" containerID="6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183" Apr 24 22:38:04.239244 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:38:04.239225 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183\": container with ID starting with 6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183 not found: ID does not exist" containerID="6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183" Apr 24 22:38:04.239302 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.239256 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183"} err="failed to get container status \"6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183\": rpc error: code = 
NotFound desc = could not find container \"6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183\": container with ID starting with 6f02288dfbf7518833dd2aeb266e0ca141ff0e22cef6fb2283e095b821385183 not found: ID does not exist" Apr 24 22:38:04.267041 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.267001 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b4f86655-v7j2c"] Apr 24 22:38:04.272899 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.272867 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69b4f86655-v7j2c"] Apr 24 22:38:04.567948 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:04.567868 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e031001-588c-46cf-8201-3b36d4d7f4a9" path="/var/lib/kubelet/pods/6e031001-588c-46cf-8201-3b36d4d7f4a9/volumes" Apr 24 22:38:11.709472 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:38:11.709436 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e29ec13_8916_46b6_8de0_0183a1aacf8b.slice/crio-4ec0ec3e48cd2b9a20718e52a7c1718dc5be9cef6a1ebefdf294d310b91fb564\": RecentStats: unable to find data in memory cache]" Apr 24 22:38:11.806107 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:11.806078 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:38:11.915211 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:11.915132 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5e29ec13-8916-46b6-8de0-0183a1aacf8b-data\") pod \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " Apr 24 22:38:11.915211 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:11.915194 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj85f\" (UniqueName: \"kubernetes.io/projected/5e29ec13-8916-46b6-8de0-0183a1aacf8b-kube-api-access-bj85f\") pod \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\" (UID: \"5e29ec13-8916-46b6-8de0-0183a1aacf8b\") " Apr 24 22:38:11.916407 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:11.916373 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e29ec13-8916-46b6-8de0-0183a1aacf8b-data" (OuterVolumeSpecName: "data") pod "5e29ec13-8916-46b6-8de0-0183a1aacf8b" (UID: "5e29ec13-8916-46b6-8de0-0183a1aacf8b"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:38:11.917458 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:11.917423 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e29ec13-8916-46b6-8de0-0183a1aacf8b-kube-api-access-bj85f" (OuterVolumeSpecName: "kube-api-access-bj85f") pod "5e29ec13-8916-46b6-8de0-0183a1aacf8b" (UID: "5e29ec13-8916-46b6-8de0-0183a1aacf8b"). InnerVolumeSpecName "kube-api-access-bj85f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:38:12.016200 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.016165 2573 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5e29ec13-8916-46b6-8de0-0183a1aacf8b-data\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:12.016200 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.016193 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bj85f\" (UniqueName: \"kubernetes.io/projected/5e29ec13-8916-46b6-8de0-0183a1aacf8b-kube-api-access-bj85f\") on node \"ip-10-0-142-173.ec2.internal\" DevicePath \"\"" Apr 24 22:38:12.256930 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.256894 2573 generic.go:358] "Generic (PLEG): container finished" podID="5e29ec13-8916-46b6-8de0-0183a1aacf8b" containerID="fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2" exitCode=0 Apr 24 22:38:12.257109 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.256957 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" Apr 24 22:38:12.257109 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.256987 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" event={"ID":"5e29ec13-8916-46b6-8de0-0183a1aacf8b","Type":"ContainerDied","Data":"fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2"} Apr 24 22:38:12.257109 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.257029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw" event={"ID":"5e29ec13-8916-46b6-8de0-0183a1aacf8b","Type":"ContainerDied","Data":"4ec0ec3e48cd2b9a20718e52a7c1718dc5be9cef6a1ebefdf294d310b91fb564"} Apr 24 22:38:12.257109 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.257049 2573 scope.go:117] "RemoveContainer" containerID="fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2" Apr 24 22:38:12.266328 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.266306 2573 scope.go:117] "RemoveContainer" containerID="fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2" Apr 24 22:38:12.266578 ip-10-0-142-173 kubenswrapper[2573]: E0424 22:38:12.266558 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2\": container with ID starting with fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2 not found: ID does not exist" containerID="fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2" Apr 24 22:38:12.266651 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.266588 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2"} err="failed to get container status \"fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2\": rpc error: code = 
NotFound desc = could not find container \"fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2\": container with ID starting with fa2a0e170687ea2b23ed10330860287aa97debdfd183dd930fa0d2febd5f55f2 not found: ID does not exist" Apr 24 22:38:12.276637 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.276584 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw"] Apr 24 22:38:12.278808 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.278781 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-4xcbw"] Apr 24 22:38:12.306696 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.306667 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j"] Apr 24 22:38:12.307038 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.307025 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e29ec13-8916-46b6-8de0-0183a1aacf8b" containerName="seaweedfs-tls-custom" Apr 24 22:38:12.307080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.307040 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e29ec13-8916-46b6-8de0-0183a1aacf8b" containerName="seaweedfs-tls-custom" Apr 24 22:38:12.307080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.307060 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e031001-588c-46cf-8201-3b36d4d7f4a9" containerName="console" Apr 24 22:38:12.307080 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.307066 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e031001-588c-46cf-8201-3b36d4d7f4a9" containerName="console" Apr 24 22:38:12.307174 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.307122 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e29ec13-8916-46b6-8de0-0183a1aacf8b" containerName="seaweedfs-tls-custom" Apr 24 22:38:12.307174 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.307131 
2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e031001-588c-46cf-8201-3b36d4d7f4a9" containerName="console" Apr 24 22:38:12.326550 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.326519 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j"] Apr 24 22:38:12.326725 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.326657 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.329259 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.329228 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 22:38:12.329259 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.329257 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 24 22:38:12.329447 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.329269 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dqlzb\"" Apr 24 22:38:12.418868 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.418830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlp7h\" (UniqueName: \"kubernetes.io/projected/9542d4ef-eaa1-4706-b175-1955ac9078c5-kube-api-access-rlp7h\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.418868 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.418876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9542d4ef-eaa1-4706-b175-1955ac9078c5-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " 
pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.419097 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.418897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9542d4ef-eaa1-4706-b175-1955ac9078c5-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.520227 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.520139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9542d4ef-eaa1-4706-b175-1955ac9078c5-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.520227 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.520176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9542d4ef-eaa1-4706-b175-1955ac9078c5-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.520480 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.520241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlp7h\" (UniqueName: \"kubernetes.io/projected/9542d4ef-eaa1-4706-b175-1955ac9078c5-kube-api-access-rlp7h\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.520725 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.520703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9542d4ef-eaa1-4706-b175-1955ac9078c5-data\") pod 
\"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.522771 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.522755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9542d4ef-eaa1-4706-b175-1955ac9078c5-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.528912 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.528889 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlp7h\" (UniqueName: \"kubernetes.io/projected/9542d4ef-eaa1-4706-b175-1955ac9078c5-kube-api-access-rlp7h\") pod \"seaweedfs-tls-custom-5c88b85bb7-45m6j\" (UID: \"9542d4ef-eaa1-4706-b175-1955ac9078c5\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.568097 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.568067 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e29ec13-8916-46b6-8de0-0183a1aacf8b" path="/var/lib/kubelet/pods/5e29ec13-8916-46b6-8de0-0183a1aacf8b/volumes" Apr 24 22:38:12.635504 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.635469 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" Apr 24 22:38:12.756779 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:12.756752 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j"] Apr 24 22:38:12.758976 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:38:12.758951 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9542d4ef_eaa1_4706_b175_1955ac9078c5.slice/crio-b0e9c993674a0b3bee89e9e5d72410eae4bf2e434610e0daacf0ec826bc05ee6 WatchSource:0}: Error finding container b0e9c993674a0b3bee89e9e5d72410eae4bf2e434610e0daacf0ec826bc05ee6: Status 404 returned error can't find the container with id b0e9c993674a0b3bee89e9e5d72410eae4bf2e434610e0daacf0ec826bc05ee6 Apr 24 22:38:13.262319 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:13.262234 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" event={"ID":"9542d4ef-eaa1-4706-b175-1955ac9078c5","Type":"ContainerStarted","Data":"9c60171dd544f5c127329648fd343d24c5ed3d7fc832c8776795fe83f70b3091"} Apr 24 22:38:13.262319 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:13.262268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" event={"ID":"9542d4ef-eaa1-4706-b175-1955ac9078c5","Type":"ContainerStarted","Data":"b0e9c993674a0b3bee89e9e5d72410eae4bf2e434610e0daacf0ec826bc05ee6"} Apr 24 22:38:13.281421 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:13.281362 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-45m6j" podStartSLOduration=1.03339017 podStartE2EDuration="1.281347675s" podCreationTimestamp="2026-04-24 22:38:12 +0000 UTC" firstStartedPulling="2026-04-24 22:38:12.760430824 +0000 UTC m=+484.751752713" lastFinishedPulling="2026-04-24 22:38:13.008388323 +0000 UTC m=+484.999710218" 
observedRunningTime="2026-04-24 22:38:13.278979041 +0000 UTC m=+485.270300940" watchObservedRunningTime="2026-04-24 22:38:13.281347675 +0000 UTC m=+485.272669575" Apr 24 22:38:21.114229 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.114193 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-djnzq"] Apr 24 22:38:21.117457 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.117441 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.120024 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.119992 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 22:38:21.120024 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.120020 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 22:38:21.126810 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.126785 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-djnzq"] Apr 24 22:38:21.197405 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.197346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbht\" (UniqueName: \"kubernetes.io/projected/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-kube-api-access-glbht\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.197638 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.197463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: 
\"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.197638 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.197487 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-data\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.298935 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.298901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.299097 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.298945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-data\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.299097 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.299033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glbht\" (UniqueName: \"kubernetes.io/projected/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-kube-api-access-glbht\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.299406 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.299384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-data\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.301370 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.301343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.308733 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.308711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbht\" (UniqueName: \"kubernetes.io/projected/9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8-kube-api-access-glbht\") pod \"seaweedfs-tls-serving-7fd5766db9-djnzq\" (UID: \"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.427849 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.427751 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" Apr 24 22:38:21.549082 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:21.549051 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-djnzq"] Apr 24 22:38:21.552106 ip-10-0-142-173 kubenswrapper[2573]: W0424 22:38:21.552077 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4c6776_a863_4f6a_bfda_4ad9fb70b1c8.slice/crio-c44cae53ee8dad843cce436d29b2a26d828835da8daed77b9e817e8e081d026e WatchSource:0}: Error finding container c44cae53ee8dad843cce436d29b2a26d828835da8daed77b9e817e8e081d026e: Status 404 returned error can't find the container with id c44cae53ee8dad843cce436d29b2a26d828835da8daed77b9e817e8e081d026e Apr 24 22:38:22.295116 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:22.295061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" event={"ID":"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8","Type":"ContainerStarted","Data":"92dcaf9138d932fb18026225e6292c9febae067d56cbec271f593870d5ba9cfa"} Apr 24 22:38:22.295116 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:22.295120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" event={"ID":"9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8","Type":"ContainerStarted","Data":"c44cae53ee8dad843cce436d29b2a26d828835da8daed77b9e817e8e081d026e"} Apr 24 22:38:22.312090 ip-10-0-142-173 kubenswrapper[2573]: I0424 22:38:22.312039 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-djnzq" podStartSLOduration=1.081506662 podStartE2EDuration="1.312023579s" podCreationTimestamp="2026-04-24 22:38:21 +0000 UTC" firstStartedPulling="2026-04-24 22:38:21.553494124 +0000 UTC m=+493.544816001" lastFinishedPulling="2026-04-24 22:38:21.78401104 +0000 UTC m=+493.775332918" 
observedRunningTime="2026-04-24 22:38:22.310117297 +0000 UTC m=+494.301439199" watchObservedRunningTime="2026-04-24 22:38:22.312023579 +0000 UTC m=+494.303345536" Apr 24 23:33:37.162567 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:37.162539 2573 ???:1] "http: TLS handshake error from 10.0.136.66:49328: EOF" Apr 24 23:33:37.167709 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:37.167689 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vp77h_69d0837a-7ff4-4b1d-ae6d-ab9e70350e7d/global-pull-secret-syncer/0.log" Apr 24 23:33:37.271928 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:37.271891 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tdfbz_5665ddf8-e176-403e-874a-d3f3d5a59d2e/konnectivity-agent/0.log" Apr 24 23:33:37.384694 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:37.384664 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-173.ec2.internal_840a721be2697b1c2b72dd11b149c26f/haproxy/0.log" Apr 24 23:33:40.351063 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.351030 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/alertmanager/0.log" Apr 24 23:33:40.381145 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.381115 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/config-reloader/0.log" Apr 24 23:33:40.406174 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.406149 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/kube-rbac-proxy-web/0.log" Apr 24 23:33:40.432483 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.432460 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/kube-rbac-proxy/0.log" Apr 24 23:33:40.460219 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.460193 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/kube-rbac-proxy-metric/0.log" Apr 24 23:33:40.485426 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.485405 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/prom-label-proxy/0.log" Apr 24 23:33:40.514327 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.514287 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_eeabef7e-b69f-4036-9683-5fa6a064923d/init-config-reloader/0.log" Apr 24 23:33:40.586899 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.586871 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d84zz_1b8389f8-eb12-4e21-ada6-3f29e21b1aec/kube-state-metrics/0.log" Apr 24 23:33:40.610008 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.609940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d84zz_1b8389f8-eb12-4e21-ada6-3f29e21b1aec/kube-rbac-proxy-main/0.log" Apr 24 23:33:40.632956 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.632932 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d84zz_1b8389f8-eb12-4e21-ada6-3f29e21b1aec/kube-rbac-proxy-self/0.log" Apr 24 23:33:40.726730 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.726700 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gcs4g_f4bd5f3e-34dc-4a5a-8885-40c15c8770b4/node-exporter/0.log" Apr 24 23:33:40.750316 ip-10-0-142-173 kubenswrapper[2573]: I0424 
23:33:40.750293 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gcs4g_f4bd5f3e-34dc-4a5a-8885-40c15c8770b4/kube-rbac-proxy/0.log" Apr 24 23:33:40.772848 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.772823 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gcs4g_f4bd5f3e-34dc-4a5a-8885-40c15c8770b4/init-textfile/0.log" Apr 24 23:33:40.949118 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.949041 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-czf56_1a74f7b3-39e3-4846-b087-d71b01aa3b59/kube-rbac-proxy-main/0.log" Apr 24 23:33:40.972354 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.972327 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-czf56_1a74f7b3-39e3-4846-b087-d71b01aa3b59/kube-rbac-proxy-self/0.log" Apr 24 23:33:40.995949 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:40.995923 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-czf56_1a74f7b3-39e3-4846-b087-d71b01aa3b59/openshift-state-metrics/0.log" Apr 24 23:33:41.200353 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:41.200265 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-txprq_45abc856-83f5-4340-bf02-0178cdf538ef/prometheus-operator/0.log" Apr 24 23:33:41.221556 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:41.221533 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-txprq_45abc856-83f5-4340-bf02-0178cdf538ef/kube-rbac-proxy/0.log" Apr 24 23:33:41.281369 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:41.281339 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-64c75dbc66-jbgq8_c7d6e71b-fb2b-4986-afbf-f73e54eca75d/telemeter-client/0.log" Apr 24 23:33:41.304519 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:41.304493 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64c75dbc66-jbgq8_c7d6e71b-fb2b-4986-afbf-f73e54eca75d/reload/0.log" Apr 24 23:33:41.342901 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:41.342874 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64c75dbc66-jbgq8_c7d6e71b-fb2b-4986-afbf-f73e54eca75d/kube-rbac-proxy/0.log" Apr 24 23:33:43.513307 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:43.513238 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84bd6c94cf-5trg4_42ad3bc8-fb68-4b2c-9463-78d5df8c58e8/console/0.log" Apr 24 23:33:44.113012 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.112975 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6"] Apr 24 23:33:44.116694 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.116672 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.119278 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.119255 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rmwdq\"/\"openshift-service-ca.crt\"" Apr 24 23:33:44.120252 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.120233 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rmwdq\"/\"default-dockercfg-z2mgt\"" Apr 24 23:33:44.120355 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.120251 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rmwdq\"/\"kube-root-ca.crt\"" Apr 24 23:33:44.124414 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.124389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6"] Apr 24 23:33:44.180396 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.180367 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-sys\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.180581 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.180424 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-podres\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.180581 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.180452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-proc\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.180581 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.180492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-lib-modules\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.180581 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.180562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbkvv\" (UniqueName: \"kubernetes.io/projected/25b5a6bc-de95-4593-a8fe-15982f145fc1-kube-api-access-xbkvv\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.281871 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.281837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbkvv\" (UniqueName: \"kubernetes.io/projected/25b5a6bc-de95-4593-a8fe-15982f145fc1-kube-api-access-xbkvv\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282039 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.281910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-sys\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " 
pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282039 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.281941 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-podres\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282039 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.281966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-proc\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282039 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.281996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-lib-modules\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282198 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.282039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-sys\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282198 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.282068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-proc\") pod 
\"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282198 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.282092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-podres\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.282198 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.282118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25b5a6bc-de95-4593-a8fe-15982f145fc1-lib-modules\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.290214 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.290190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbkvv\" (UniqueName: \"kubernetes.io/projected/25b5a6bc-de95-4593-a8fe-15982f145fc1-kube-api-access-xbkvv\") pod \"perf-node-gather-daemonset-ljdl6\" (UID: \"25b5a6bc-de95-4593-a8fe-15982f145fc1\") " pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.427878 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.427786 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:44.545889 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.545838 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6"] Apr 24 23:33:44.548389 ip-10-0-142-173 kubenswrapper[2573]: W0424 23:33:44.548364 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod25b5a6bc_de95_4593_a8fe_15982f145fc1.slice/crio-fdc003041ad1e2cee7f66041ee6edd912857c8fe18f3af36f5a7a5294d4ebc52 WatchSource:0}: Error finding container fdc003041ad1e2cee7f66041ee6edd912857c8fe18f3af36f5a7a5294d4ebc52: Status 404 returned error can't find the container with id fdc003041ad1e2cee7f66041ee6edd912857c8fe18f3af36f5a7a5294d4ebc52 Apr 24 23:33:44.549946 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.549923 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:33:44.635998 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.635969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4btgp_a1295445-89e1-4b74-af4a-124c7863b64d/dns/0.log" Apr 24 23:33:44.656514 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.656490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4btgp_a1295445-89e1-4b74-af4a-124c7863b64d/kube-rbac-proxy/0.log" Apr 24 23:33:44.789533 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:44.789507 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b7xgj_adea4cfb-400a-43d8-8b2d-0cd0d88160f5/dns-node-resolver/0.log" Apr 24 23:33:45.249232 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:45.249205 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9lblg_1a9aa9ab-1597-4fc6-8210-93b838855a27/node-ca/0.log" Apr 24 23:33:45.257292 ip-10-0-142-173 
kubenswrapper[2573]: I0424 23:33:45.257262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" event={"ID":"25b5a6bc-de95-4593-a8fe-15982f145fc1","Type":"ContainerStarted","Data":"3c9e2a5eb652df7a990b139c4d023903a6908215ef02d4a4a65975e9c34782e0"} Apr 24 23:33:45.257441 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:45.257300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" event={"ID":"25b5a6bc-de95-4593-a8fe-15982f145fc1","Type":"ContainerStarted","Data":"fdc003041ad1e2cee7f66041ee6edd912857c8fe18f3af36f5a7a5294d4ebc52"} Apr 24 23:33:45.257441 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:45.257332 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:45.273720 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:45.273677 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" podStartSLOduration=1.273658562 podStartE2EDuration="1.273658562s" podCreationTimestamp="2026-04-24 23:33:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:33:45.271694821 +0000 UTC m=+3817.263016721" watchObservedRunningTime="2026-04-24 23:33:45.273658562 +0000 UTC m=+3817.264980463" Apr 24 23:33:46.301662 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:46.301636 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rcpmc_762fb5cd-2743-4739-8c26-fe80bd1dcb02/serve-healthcheck-canary/0.log" Apr 24 23:33:46.800757 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:46.800725 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-n6js7_985f830c-629d-4d9c-aa01-fae79c3e683a/kube-rbac-proxy/0.log" Apr 24 23:33:46.821757 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:46.821726 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n6js7_985f830c-629d-4d9c-aa01-fae79c3e683a/exporter/0.log" Apr 24 23:33:46.842913 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:46.842886 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n6js7_985f830c-629d-4d9c-aa01-fae79c3e683a/extractor/0.log" Apr 24 23:33:48.867860 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:48.867829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-549bc44c6d-l7h8w_13de8ff4-3f03-4cf2-9d1c-ea9f64ace207/manager/0.log" Apr 24 23:33:48.887174 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:48.887145 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2zqjk_d4526e70-49e7-4c60-ac11-447a12b62f19/manager/0.log" Apr 24 23:33:48.910078 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:48.910057 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-4dt9g_06e263d9-56d4-4186-a8a3-aed4cbc1c175/server/0.log" Apr 24 23:33:49.125905 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:49.125829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-4ctx7_0ef77a69-67af-4e1f-b012-5390274c2187/manager/0.log" Apr 24 23:33:49.236281 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:49.236255 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-45m6j_9542d4ef-eaa1-4706-b175-1955ac9078c5/seaweedfs-tls-custom/0.log" Apr 24 23:33:49.257809 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:49.257784 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-djnzq_9d4c6776-a863-4f6a-bfda-4ad9fb70b1c8/seaweedfs-tls-serving/0.log" Apr 24 23:33:51.270992 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:51.270967 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rmwdq/perf-node-gather-daemonset-ljdl6" Apr 24 23:33:54.304680 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.304644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7v756_d413d1a6-f8ca-40a5-90ec-78dff39daaf1/kube-multus/0.log" Apr 24 23:33:54.713631 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.713536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/kube-multus-additional-cni-plugins/0.log" Apr 24 23:33:54.764290 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.764264 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/egress-router-binary-copy/0.log" Apr 24 23:33:54.806656 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.806631 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/cni-plugins/0.log" Apr 24 23:33:54.831940 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.831920 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/bond-cni-plugin/0.log" Apr 24 23:33:54.854323 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.854304 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/routeoverride-cni/0.log" Apr 24 23:33:54.879745 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.879714 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/whereabouts-cni-bincopy/0.log" Apr 24 23:33:54.901767 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:54.901739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mq5d5_5b1b5f17-6df7-4280-b50e-f0241d9ab7d6/whereabouts-cni/0.log" Apr 24 23:33:55.002492 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.002471 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kdqw9_b114ecc3-3191-4768-a2bc-d878a4044ee3/network-metrics-daemon/0.log" Apr 24 23:33:55.023104 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.023082 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kdqw9_b114ecc3-3191-4768-a2bc-d878a4044ee3/kube-rbac-proxy/0.log" Apr 24 23:33:55.878715 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.878644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/ovn-controller/0.log" Apr 24 23:33:55.928006 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.927982 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/ovn-acl-logging/0.log" Apr 24 23:33:55.947822 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.947798 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/kube-rbac-proxy-node/0.log" Apr 24 23:33:55.973060 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.973038 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 23:33:55.997480 
ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:55.997456 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/northd/0.log" Apr 24 23:33:56.022009 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:56.021992 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/nbdb/0.log" Apr 24 23:33:56.049101 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:56.049083 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/sbdb/0.log" Apr 24 23:33:56.155307 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:56.155222 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ckgcl_f179e6c5-8a33-48d8-96ce-1400a4dcde57/ovnkube-controller/0.log" Apr 24 23:33:57.745096 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:57.745065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hqprm_1c5e89d8-8a8e-41eb-a725-89b55ae5ed48/network-check-target-container/0.log" Apr 24 23:33:58.664290 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:58.664259 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fk6tq_f09ea6fe-aff1-4e92-a7ca-c70f50d186ec/iptables-alerter/0.log" Apr 24 23:33:59.295346 ip-10-0-142-173 kubenswrapper[2573]: I0424 23:33:59.295317 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-brsxm_a75bb813-f2e4-4f8e-a0e9-677e2345d5f2/tuned/0.log"