Apr 23 14:57:19.379016 ip-10-0-141-16 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 14:57:19.820056 ip-10-0-141-16 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 14:57:19.820056 ip-10-0-141-16 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 14:57:19.820056 ip-10-0-141-16 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 14:57:19.820056 ip-10-0-141-16 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 14:57:19.820056 ip-10-0-141-16 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 14:57:19.821503 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.821419 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 14:57:19.824282 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824265 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:57:19.824282 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824281 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824286 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824290 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824293 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824296 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824300 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824302 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824305 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824308 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824311 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824313 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824316 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824320 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824323 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824326 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824329 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824331 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824334 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824336 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824339 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:57:19.824352 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824341 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824344 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824346 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824348 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824351 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824353 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824356 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824359 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824362 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824365 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824367 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824371 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824375 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824378 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824381 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824383 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824386 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824389 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824392 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824394 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:57:19.824852 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824397 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824400 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824402 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824405 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824409 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824413 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824416 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824419 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824421 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824425 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824427 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824429 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824432 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824434 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824438 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824441 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824443 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824446 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824449 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:57:19.825335 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824452 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824454 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824457 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824460 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824462 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824465 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824468 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824470 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824473 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824476 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824478 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824481 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824483 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824486 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824489 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824492 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824495 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824497 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824500 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824503 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:57:19.825822 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824505 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:57:19.826306 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824508 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:57:19.826306 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824510 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:57:19.826306 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824513 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:57:19.826306 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824515 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:57:19.826306 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.824518 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826349 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826357 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826360 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826363 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826367 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826369 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826372 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826375 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826378 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826381 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826384 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826388 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826391 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826394 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826396 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826399 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826402 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826405 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:57:19.826426 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826407 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826410 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826413 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826415 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826418 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826420 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826423 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826427 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826431 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826434 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826438 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826440 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826443 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826446 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826449 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826451 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826454 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826456 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826459 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:57:19.827002 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826461 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826464 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826466 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826469 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826471 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826474 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826477 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826480 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826482 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826485 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826495 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826498 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826500 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826502 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826505 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826508 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826512 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826514 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826517 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826519 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:57:19.827471 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826522 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826525 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826528 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826530 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826533 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826536 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826538 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826540 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826543 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826545 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826548 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826551 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826553 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826556 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826558 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826561 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826564 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826567 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826570 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826573 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:57:19.827974 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826575 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826578 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826580 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826583 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826586 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826588 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826591 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826594 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.826597 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826666 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826673 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826680 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826687 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826710 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826756 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826819 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826826 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826832 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826837 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826843 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826849 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826854 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826860 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 23 14:57:19.828570 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826870 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826876 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.826881 2577 flags.go:64] FLAG: --cloud-config=""
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828256 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828277 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828288 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828295 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828301 2577 flags.go:64] FLAG: --config-dir=""
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828307 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828315 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828358 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828369 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828375 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828381 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828386 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828935 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828945 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828950 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828954 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828959 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828963 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828966 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828969 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828972 2577 flags.go:64] FLAG: --enable-server="true"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828975 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 14:57:19.829287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828981 2577 flags.go:64] FLAG: --event-burst="100"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828984 2577 flags.go:64] FLAG: --event-qps="50"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828988 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828991 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828994 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.828998 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829001 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829004 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829007 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829010 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829013 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829016 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829019 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829022 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829025 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829028 2577 flags.go:64] FLAG: --feature-gates=""
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829033 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829039 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]:
I0423 14:57:19.829043 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829046 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829049 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829053 2577 flags.go:64] FLAG: --help="false" Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829056 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-141-16.ec2.internal" Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829059 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 14:57:19.829904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829063 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829066 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829070 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829074 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829077 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829079 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829082 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829085 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 14:57:19.830488 
ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829089 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829093 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829095 2577 flags.go:64] FLAG: --kube-reserved="" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829098 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829101 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829104 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829107 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829110 2577 flags.go:64] FLAG: --lock-file="" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829113 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829116 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829119 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829125 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829128 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829131 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829135 2577 flags.go:64] FLAG: 
--logging-format="text" Apr 23 14:57:19.830488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829138 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829142 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829145 2577 flags.go:64] FLAG: --manifest-url="" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829148 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829153 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829156 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829160 2577 flags.go:64] FLAG: --max-pods="110" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829163 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829167 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829170 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829173 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829176 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829179 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829182 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 
14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829190 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829193 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829196 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829199 2577 flags.go:64] FLAG: --pod-cidr="" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829202 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829208 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829211 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829214 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829217 2577 flags.go:64] FLAG: --port="10250" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829220 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 14:57:19.831074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829223 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-044a1a00c70ee5b4b" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829227 2577 flags.go:64] FLAG: --qos-reserved="" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829230 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829232 2577 flags.go:64] FLAG: --register-node="true" Apr 23 14:57:19.831689 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:19.829235 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829238 2577 flags.go:64] FLAG: --register-with-taints="" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829242 2577 flags.go:64] FLAG: --registry-burst="10" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829245 2577 flags.go:64] FLAG: --registry-qps="5" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829248 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829252 2577 flags.go:64] FLAG: --reserved-memory="" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829255 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829258 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829262 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829265 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829268 2577 flags.go:64] FLAG: --runonce="false" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829271 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829274 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829277 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829279 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 14:57:19.831689 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:19.829282 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829285 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829289 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829292 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829295 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829298 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829301 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 14:57:19.831689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829304 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829307 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829310 2577 flags.go:64] FLAG: --system-cgroups="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829313 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829319 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829322 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829325 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829330 2577 flags.go:64] FLAG: 
--tls-min-version="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829333 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829336 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829339 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829342 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829345 2577 flags.go:64] FLAG: --v="2" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829350 2577 flags.go:64] FLAG: --version="false" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829357 2577 flags.go:64] FLAG: --vmodule="" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829362 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.829365 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829465 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829470 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829474 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829477 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829480 2577 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829483 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 14:57:19.832327 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829486 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829489 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829492 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829495 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829497 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829500 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829502 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829505 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829508 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829511 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829513 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 14:57:19.832933 
ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829516 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829518 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829521 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829523 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829526 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829528 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829531 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829533 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829536 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 14:57:19.832933 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829538 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829541 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829544 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829549 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations 
Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829552 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829554 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829557 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829559 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829562 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829564 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829567 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829569 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829572 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829574 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829577 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829579 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829582 2577 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829585 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829587 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829590 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 14:57:19.833480 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829592 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829595 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829599 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829602 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829604 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829607 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829610 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829612 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829615 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829618 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829621 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829623 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829626 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829629 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829631 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829635 2577 feature_gate.go:328] 
unrecognized feature gate: AdminNetworkPolicy Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829639 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829641 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829644 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829647 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 14:57:19.834000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829649 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829651 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829654 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829657 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829659 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829662 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829664 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829666 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829669 2577 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829671 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829674 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829676 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829679 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829681 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829684 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829688 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829692 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829712 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829717 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 14:57:19.834506 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.829722 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 14:57:19.834981 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.830263 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 14:57:19.836616 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.836595 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 14:57:19.836616 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.836616 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836665 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836670 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836674 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation 
Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836677 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836682 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836687 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:57:19.836689 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836690 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836708 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836711 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836714 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836718 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836721 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836724 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836726 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836729 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836732 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836735 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836737 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836741 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836743 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836746 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836748 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836751 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836754 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836756 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836759 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:57:19.836921 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836761 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836764 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836767 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836769 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836772 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836775 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836777 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836780 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836784 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836788 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836791 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836794 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836798 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836801 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836804 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836808 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836810 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836813 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836816 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:57:19.837416 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836819 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836822 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836824 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836827 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836830 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836832 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836835 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836838 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836840 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836843 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836845 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836848 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836850 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836853 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836855 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836858 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836860 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836863 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836865 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836868 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:57:19.837892 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836871 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836873 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836876 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836879 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836881 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836884 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836887 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836891 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836894 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836896 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836899 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836902 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836904 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836907 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836910 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836913 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836916 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836918 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836921 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836924 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:57:19.838372 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.836927 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.836932 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837031 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837035 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837038 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837041 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837045 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837047 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837050 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837053 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837055 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837058 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837061 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837064 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837067 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837069 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 14:57:19.838894 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837072 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837074 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837077 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837080 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837083 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837086 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837088 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837091 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837093 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837096 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837099 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837101 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837104 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837106 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837109 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837111 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837114 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837116 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837119 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837122 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 14:57:19.839294 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837124 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837127 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837129 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837132 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837134 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837136 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837139 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837141 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837144 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837147 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837151 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837155 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837157 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837160 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837162 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837166 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837169 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837171 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837174 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837176 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 14:57:19.839833 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837179 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837181 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837185 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837189 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837192 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837195 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837198 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837201 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837204 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837206 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837209 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837212 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837214 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837217 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837219 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837222 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837225 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837227 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837230 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 14:57:19.840309 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837233 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837235 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837238 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837241 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837243 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837246 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837249 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837251 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837254 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837256 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837259 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837262 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:19.837265 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.837270 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 14:57:19.840787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.837992 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 14:57:19.842467 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.842330 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 14:57:19.843527 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.843517 2577 server.go:1019] "Starting client certificate rotation"
Apr 23 14:57:19.843630 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.843611 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 14:57:19.843667 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.843657 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 14:57:19.868358 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.868335 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 14:57:19.872796 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.872770 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 14:57:19.886552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.886530 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 23 14:57:19.893328 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.893307 2577 log.go:25] "Validated CRI v1 image API"
Apr 23 14:57:19.894469 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.894443 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 14:57:19.897456 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.897437 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 14:57:19.898944 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.898923 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9c7a2073-c3d7-4c77-8c93-f8d8bf1cf3db:/dev/nvme0n1p4 ef4f15f5-07b1-4bbe-9fde-cac262aece29:/dev/nvme0n1p3]
Apr 23 14:57:19.899036 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.898947 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 14:57:19.905306 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.905181 2577 manager.go:217] Machine: {Timestamp:2026-04-23 14:57:19.902739629 +0000 UTC m=+0.409822009 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101427 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24a991a114cc11bc9c31e767e9269c SystemUUID:ec24a991-a114-cc11-bc9c-31e767e9269c BootID:655c07f3-5579-47dd-a549-182ee481f5f0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c6:87:6d:fe:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c6:87:6d:fe:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:98:c3:00:b0:ec Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 14:57:19.905306 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.905291 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 14:57:19.905492 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.905414 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 14:57:19.907656 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.907631 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 14:57:19.907879 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.907657 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 14:57:19.907968 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.907893 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 14:57:19.907968 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.907907 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 14:57:19.907968 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.907932 2577
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 14:57:19.909512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.909496 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 14:57:19.910744 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.910732 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 23 14:57:19.910878 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.910867 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 14:57:19.913153 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.913140 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 23 14:57:19.913221 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.913161 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 14:57:19.913221 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.913178 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 14:57:19.913221 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.913194 2577 kubelet.go:397] "Adding apiserver pod source" Apr 23 14:57:19.913221 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.913206 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 14:57:19.914250 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.914236 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 14:57:19.914336 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.914260 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 14:57:19.917299 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.917285 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 14:57:19.919041 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:19.919019 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fgmm8" Apr 23 14:57:19.919125 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.919100 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 14:57:19.920345 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920330 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 14:57:19.920419 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920352 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 14:57:19.920419 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920361 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 14:57:19.920419 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920368 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 14:57:19.920419 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920377 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 14:57:19.920961 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920854 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 14:57:19.921026 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920985 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 14:57:19.921026 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.920998 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 14:57:19.921026 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.921013 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 14:57:19.921026 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.921023 2577 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 23 14:57:19.921189 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.921038 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 14:57:19.921189 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.921056 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 14:57:19.921908 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.921888 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 14:57:19.921975 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.921914 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 14:57:19.924692 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.924654 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-16.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 14:57:19.924802 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:19.924732 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 14:57:19.924863 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:19.924800 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 14:57:19.925741 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.925722 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 14:57:19.925821 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:19.925774 2577 server.go:1295] "Started kubelet" Apr 23 14:57:19.925909 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.925860 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 14:57:19.925945 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.925933 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 14:57:19.926538 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.925892 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 14:57:19.926663 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.926639 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fgmm8" Apr 23 14:57:19.926986 ip-10-0-141-16 systemd[1]: Started Kubernetes Kubelet. Apr 23 14:57:19.927807 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.927780 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 14:57:19.930026 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.930005 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 23 14:57:19.933748 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.933731 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 14:57:19.933834 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.933755 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 14:57:19.934039 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:19.932995 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-16.ec2.internal.18a90449162ffafe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-16.ec2.internal,UID:ip-10-0-141-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-16.ec2.internal,},FirstTimestamp:2026-04-23 14:57:19.925738238 +0000 UTC m=+0.432820618,LastTimestamp:2026-04-23 14:57:19.925738238 +0000 UTC m=+0.432820618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-16.ec2.internal,}" Apr 23 14:57:19.934438 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934421 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 14:57:19.934510 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934442 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 14:57:19.934510 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934421 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 14:57:19.934613 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934532 2577 factory.go:55] Registering systemd factory Apr 23 14:57:19.934613 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:19.934549 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found" Apr 23 14:57:19.934613 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934555 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 23 14:57:19.934613 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934571 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 23 14:57:19.934613 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934587 2577 factory.go:223] Registration of the systemd container factory successfully Apr 23 14:57:19.934838 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934824 2577 factory.go:153] Registering CRI-O factory Apr 23 14:57:19.934895 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:19.934841 2577 factory.go:223] Registration of the crio container factory successfully Apr 23 14:57:19.934945 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934898 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 14:57:19.934945 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934917 2577 factory.go:103] Registering Raw factory Apr 23 14:57:19.934945 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.934931 2577 manager.go:1196] Started watching for new ooms in manager Apr 23 14:57:19.935347 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.935333 2577 manager.go:319] Starting recovery of all containers Apr 23 14:57:19.945205 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.945178 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 14:57:19.946240 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.946217 2577 manager.go:324] Recovery completed Apr 23 14:57:19.948170 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:19.948142 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal" Apr 23 14:57:19.953830 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.953815 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 14:57:19.956074 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.956059 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory" Apr 23 14:57:19.956147 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.956090 2577 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 14:57:19.956147 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.956102 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID" Apr 23 14:57:19.956549 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.956535 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 14:57:19.956607 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.956549 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 14:57:19.956607 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.956567 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 23 14:57:19.958566 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.958554 2577 policy_none.go:49] "None policy: Start" Apr 23 14:57:19.958615 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.958570 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 14:57:19.958615 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:19.958581 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 23 14:57:20.006402 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.006382 2577 manager.go:341] "Starting Device Plugin manager" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.006417 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.006429 2577 server.go:85] "Starting device plugin registration server" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.006715 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.006735 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 14:57:20.022689 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:20.006862 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.006986 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.006997 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.007739 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 14:57:20.022689 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.007790 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-16.ec2.internal\" not found" Apr 23 14:57:20.087999 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.087911 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 14:57:20.089205 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.089187 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 14:57:20.089252 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.089219 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 14:57:20.089252 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.089240 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 14:57:20.089252 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.089246 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 14:57:20.089377 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.089330 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 14:57:20.093092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.093074 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 14:57:20.106975 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.106959 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 14:57:20.108152 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.108129 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory" Apr 23 14:57:20.108267 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.108164 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 14:57:20.108267 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.108181 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID" Apr 23 14:57:20.108267 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.108205 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.117619 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.117602 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.117686 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.117626 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-16.ec2.internal\": node \"ip-10-0-141-16.ec2.internal\" not found" Apr 23 14:57:20.141268 
ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.141241 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found" Apr 23 14:57:20.189774 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.189721 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"] Apr 23 14:57:20.189925 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.189835 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 14:57:20.190823 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.190808 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory" Apr 23 14:57:20.190903 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.190842 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 14:57:20.190903 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.190856 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID" Apr 23 14:57:20.192076 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192059 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 14:57:20.192243 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192226 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.192294 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192259 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 14:57:20.192802 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192787 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory" Apr 23 14:57:20.192857 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192792 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory" Apr 23 14:57:20.192857 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192835 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 14:57:20.192857 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192849 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID" Apr 23 14:57:20.192950 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192817 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 14:57:20.192950 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.192903 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID" Apr 23 14:57:20.193828 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.193814 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.193870 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.193842 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 14:57:20.194538 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.194523 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory" Apr 23 14:57:20.194584 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.194554 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 14:57:20.194584 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.194569 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID" Apr 23 14:57:20.219549 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.219528 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.224230 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.224213 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.235347 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.235325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.235452 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.235351 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.235452 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.235369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.241827 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.241804 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found" Apr 23 14:57:20.336271 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.336237 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.336271 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.336195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" Apr 23 14:57:20.336459 
ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.336304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.336459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.336325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.336459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.336361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.336459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.336385 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.342334 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.342285 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 23 14:57:20.442547 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.442506 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 23 14:57:20.521764 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.521731 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.526414 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.526397 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.543642 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.543616 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 23 14:57:20.644173 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.644136 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 23 14:57:20.744742 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.744692 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 23 14:57:20.770413 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.770391 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:57:20.834789 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.834760 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.843271 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.843252 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 14:57:20.843394 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.843365 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 14:57:20.843519 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.843404 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 14:57:20.843519 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.843411 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 14:57:20.843519 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.843452 2577 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://af61c59681f4b4780853142ec2d28b02-ae7a07c23c44a6cd.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.141.16:60780->44.196.152.103:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.843519 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.843479 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 23 14:57:20.859865 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.859841 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 14:57:20.914264 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.914200 2577 apiserver.go:52] "Watching apiserver"
Apr 23 14:57:20.923739 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.923165 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 14:57:20.925206 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.925175 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal","openshift-cluster-node-tuning-operator/tuned-nhrx7","openshift-image-registry/node-ca-tg5n4","openshift-network-diagnostics/network-check-target-6rls8","openshift-ovn-kubernetes/ovnkube-node-fpxb2","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc","openshift-dns/node-resolver-2hv6m","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal","openshift-multus/multus-additional-cni-plugins-jhxf7","openshift-multus/multus-w69vk","openshift-multus/network-metrics-daemon-bjpzv","openshift-network-operator/iptables-alerter-k777l","kube-system/konnectivity-agent-6ttdv"]
Apr 23 14:57:20.926801 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.926781 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.928519 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.928496 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tg5n4"
Apr 23 14:57:20.929313 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.929295 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.929474 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.929450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:20.929545 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.929514 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:20.929545 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.929525 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pwgwd\""
Apr 23 14:57:20.929672 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.929657 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.929672 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.929658 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 14:52:19 +0000 UTC" deadline="2027-11-29 18:10:54.670894632 +0000 UTC"
Apr 23 14:57:20.929784 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.929677 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14043h13m33.741219401s"
Apr 23 14:57:20.930423 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.930408 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.930483 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.930472 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:20.930935 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.930916 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zvrrj\""
Apr 23 14:57:20.931030 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.930918 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 14:57:20.931030 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.930963 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.931477 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.931464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2hv6m"
Apr 23 14:57:20.932160 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.932138 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.932296 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.932283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 14:57:20.932628 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.932615 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.932675 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.932638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.933775 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.933760 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.933932 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.933916 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 14:57:20.934170 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.934157 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.934408 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.934392 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-m88kl\""
Apr 23 14:57:20.934860 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.934847 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:20.934921 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:20.934906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:20.935989 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.935972 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:20.936042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936031 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.936309 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936291 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.936390 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936360 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 14:57:20.936726 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936710 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 14:57:20.936805 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936729 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 14:57:20.936862 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936824 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.936911 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936870 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:20.936963 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.936935 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4qssr\""
Apr 23 14:57:20.937086 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pv5wb\""
Apr 23 14:57:20.937179 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937136 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.937179 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937148 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6f2v7\""
Apr 23 14:57:20.937179 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937143 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 14:57:20.937326 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937182 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 14:57:20.937381 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937339 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.937533 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937517 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 14:57:20.937603 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937592 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.937735 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937720 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 14:57:20.937924 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937906 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ndglh\""
Apr 23 14:57:20.937983 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.937952 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 14:57:20.938190 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-kubernetes\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.938295 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysctl-d\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.938295 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-tuned\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.938295 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.938295 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 14:57:20.938459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938290 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 14:57:20.938459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysctl-conf\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.938459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-device-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:20.938459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-k8s-cni-cncf-io\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.938647 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938474 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-multus-certs\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.938647 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/018b5460-f98e-40c3-a50c-c43ca05fa0ef-cni-binary-copy\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.938647 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.938818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-tmp-dir\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m"
Apr 23 14:57:20.938818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938716 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-system-cni-dir\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.938818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-node-log\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.938818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-var-lib-kubelet\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.938818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-cnibin\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.939244 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.938835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.939394 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939379 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 14:57:20.939394 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:20.939573 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-slash\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.939692 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-cni-netd\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.939813 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-modprobe-d\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.939813 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysconfig\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.939909 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-systemd\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.939956 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-socket-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:20.940036 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.939989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-sys-fs\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:20.940088 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-system-cni-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.940180 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940164 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-os-release\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.940249 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b32c0a28-43de-408e-beab-64a0a9ff6ac3-serviceca\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4"
Apr 23 14:57:20.940299 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940257 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 14:57:20.940299 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsl4q\" (UniqueName: \"kubernetes.io/projected/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-kube-api-access-nsl4q\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m"
Apr 23 14:57:20.940389 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940315 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-cni-binary-copy\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.940389 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940334 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 14:57:20.940491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940472 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4dt8n\""
Apr 23 14:57:20.940559 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8b4\" (UniqueName: \"kubernetes.io/projected/82866f79-d634-4792-b3fd-ef2753feb90f-kube-api-access-zp8b4\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.940643 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940237 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-26p67\""
Apr 23 14:57:20.940854 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-cni-bin\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.940949 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940930 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxfj\" (UniqueName: \"kubernetes.io/projected/018b5460-f98e-40c3-a50c-c43ca05fa0ef-kube-api-access-prxfj\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.941001 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.940988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-kubelet\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.941111 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4d459d1-0077-45ce-b190-9cc485dd9ae5-tmp\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.941201 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-env-overrides\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.941476 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sv98\" (UniqueName: \"kubernetes.io/projected/43741ca5-8515-4e77-be0b-52d3495c2460-kube-api-access-2sv98\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.941535 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b55hg\" (UniqueName: \"kubernetes.io/projected/e4d459d1-0077-45ce-b190-9cc485dd9ae5-kube-api-access-b55hg\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.941614 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5lf\" (UniqueName: \"kubernetes.io/projected/591b1876-3766-4f35-9533-34a2ea6f684e-kube-api-access-9t5lf\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:20.941789 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941768 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-cni-multus\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.941885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-etc-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.941951 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-sys\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:20.942007 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.941989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-cnibin\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.942092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.942213 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-ovn\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.942262 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:20.942385 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-os-release\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:20.942449 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-netns\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.942512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-etc-kubernetes\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:20.942576 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l8bl\" (UniqueName: \"kubernetes.io/projected/b32c0a28-43de-408e-beab-64a0a9ff6ac3-kube-api-access-6l8bl\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4"
Apr 23 14:57:20.942691 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.942640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-run-netns\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:20.943044 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-registration-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:20.943122 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-etc-selinux\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:20.943122 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-cni-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:20.943228 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-hostroot\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:20.943228 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943175 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-conf-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:20.943228 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943208 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-daemon-config\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:20.943369 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-log-socket\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.943369 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-kubelet\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:20.943458 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-lib-modules\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:20.943458 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-host\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:20.943554 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-hosts-file\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m" Apr 23 14:57:20.943605 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-socket-dir-parent\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:20.943661 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-systemd\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.943726 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-ovnkube-config\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.943784 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-run\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:20.943834 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943782 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:20.943876 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:20.943977 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-var-lib-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.944056 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.943995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43741ca5-8515-4e77-be0b-52d3495c2460-ovn-node-metrics-cert\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.944056 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.944033 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-ovnkube-script-lib\") pod 
\"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.944155 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.944111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b32c0a28-43de-408e-beab-64a0a9ff6ac3-host\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4" Apr 23 14:57:20.944451 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.944429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-systemd-units\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.944527 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.944470 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.944527 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.944489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-cni-bin\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:20.944527 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.944506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx2p7\" 
(UniqueName: \"kubernetes.io/projected/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-kube-api-access-mx2p7\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:20.950830 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.950811 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 14:57:20.985180 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.985156 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ntgrg" Apr 23 14:57:20.993922 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:20.993893 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ntgrg" Apr 23 14:57:21.035320 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.035299 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 14:57:21.044929 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.044903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-registration-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.044929 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.044934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-etc-selinux\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.045136 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:21.044952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-cni-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.044967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-hostroot\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-hostroot\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-conf-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-registration-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045059 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-cni-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-etc-selinux\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-daemon-config\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-conf-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045162 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-log-socket\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-kubelet\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-lib-modules\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-host\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-log-socket\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-hosts-file\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-kubelet\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-socket-dir-parent\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-host\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-socket-dir-parent\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-hosts-file\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-systemd\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045345 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-lib-modules\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-ovnkube-config\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-systemd\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045393 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-run\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.045520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-run\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-var-lib-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045512 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43741ca5-8515-4e77-be0b-52d3495c2460-ovn-node-metrics-cert\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-ovnkube-script-lib\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b32c0a28-43de-408e-beab-64a0a9ff6ac3-host\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-systemd-units\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-var-lib-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045615 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-cni-bin\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx2p7\" (UniqueName: \"kubernetes.io/projected/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-kube-api-access-mx2p7\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/018b5460-f98e-40c3-a50c-c43ca05fa0ef-multus-daemon-config\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-kubernetes\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:57:21.045801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysctl-d\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-tuned\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.046225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b32c0a28-43de-408e-beab-64a0a9ff6ac3-host\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.045862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:57:21.045818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-systemd-units\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-ovnkube-config\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysctl-d\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-kubernetes\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysctl-conf\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046141 2577 
swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-device-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-device-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046282 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-cni-bin\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-k8s-cni-cncf-io\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysctl-conf\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-multus-certs\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2c98eab9-7fb6-4067-92ac-e85bf7a1af4b-agent-certs\") pod \"konnectivity-agent-6ttdv\" (UID: \"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b\") " pod="kube-system/konnectivity-agent-6ttdv" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046352 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-k8s-cni-cncf-io\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.047042 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/018b5460-f98e-40c3-a50c-c43ca05fa0ef-cni-binary-copy\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-multus-certs\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047960 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:57:21.046462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-iptables-alerter-script\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-tmp-dir\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-system-cni-dir\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-node-log\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-var-lib-kubelet\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-cnibin\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-node-log\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-ovnkube-script-lib\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-system-cni-dir\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-var-lib-kubelet\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-slash\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.047960 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-cnibin\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-cni-netd\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5sn7\" (UniqueName: \"kubernetes.io/projected/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-kube-api-access-n5sn7\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-modprobe-d\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysconfig\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-tmp-dir\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-systemd\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-socket-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-sys-fs\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/018b5460-f98e-40c3-a50c-c43ca05fa0ef-cni-binary-copy\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.046994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-system-cni-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-slash\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-sys-fs\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-modprobe-d\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-cni-netd\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-sysconfig\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-os-release\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.048750 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-systemd\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b32c0a28-43de-408e-beab-64a0a9ff6ac3-serviceca\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-os-release\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/591b1876-3766-4f35-9533-34a2ea6f684e-socket-dir\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsl4q\" (UniqueName: \"kubernetes.io/projected/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-kube-api-access-nsl4q\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-cni-binary-copy\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8b4\" (UniqueName: \"kubernetes.io/projected/82866f79-d634-4792-b3fd-ef2753feb90f-kube-api-access-zp8b4\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-system-cni-dir\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047335 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-cni-bin\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prxfj\" (UniqueName: \"kubernetes.io/projected/018b5460-f98e-40c3-a50c-c43ca05fa0ef-kube-api-access-prxfj\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-kubelet\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2c98eab9-7fb6-4067-92ac-e85bf7a1af4b-konnectivity-ca\") pod \"konnectivity-agent-6ttdv\" (UID: \"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b\") " pod="kube-system/konnectivity-agent-6ttdv" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4d459d1-0077-45ce-b190-9cc485dd9ae5-tmp\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047529 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b32c0a28-43de-408e-beab-64a0a9ff6ac3-serviceca\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-env-overrides\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv98\" (UniqueName: \"kubernetes.io/projected/43741ca5-8515-4e77-be0b-52d3495c2460-kube-api-access-2sv98\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.049491 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-cni-bin\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-kubelet\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047744 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-host-slash\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l" Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b55hg\" (UniqueName: \"kubernetes.io/projected/e4d459d1-0077-45ce-b190-9cc485dd9ae5-kube-api-access-b55hg\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5lf\" (UniqueName: \"kubernetes.io/projected/591b1876-3766-4f35-9533-34a2ea6f684e-kube-api-access-9t5lf\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-cni-multus\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk" Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-etc-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:57:21.050092 
ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-sys\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.047931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-cnibin\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-ovn\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43741ca5-8515-4e77-be0b-52d3495c2460-env-overrides\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-var-lib-cni-multus\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-etc-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.048273 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-ovn\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-cnibin\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4d459d1-0077-45ce-b190-9cc485dd9ae5-sys\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:21.050092 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-run-openvswitch\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/82866f79-d634-4792-b3fd-ef2753feb90f-cni-binary-copy\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.048666 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. No retries permitted until 2026-04-23 14:57:21.548620899 +0000 UTC m=+2.055703266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-os-release\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-netns\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-etc-kubernetes\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l8bl\" (UniqueName: \"kubernetes.io/projected/b32c0a28-43de-408e-beab-64a0a9ff6ac3-kube-api-access-6l8bl\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-run-netns\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-host-run-netns\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/43741ca5-8515-4e77-be0b-52d3495c2460-host-run-netns\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/018b5460-f98e-40c3-a50c-c43ca05fa0ef-etc-kubernetes\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.048962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/82866f79-d634-4792-b3fd-ef2753feb90f-os-release\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.049507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4d459d1-0077-45ce-b190-9cc485dd9ae5-etc-tuned\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.049828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43741ca5-8515-4e77-be0b-52d3495c2460-ovn-node-metrics-cert\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.050599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.049932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4d459d1-0077-45ce-b190-9cc485dd9ae5-tmp\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:21.058012 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.057986 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:57:21.058129 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.058017 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:57:21.058129 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.058030 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:21.058230 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.058220 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. No retries permitted until 2026-04-23 14:57:21.558191702 +0000 UTC m=+2.065274065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:21.060631 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.060609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx2p7\" (UniqueName: \"kubernetes.io/projected/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-kube-api-access-mx2p7\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:21.061252 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.061227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxfj\" (UniqueName: \"kubernetes.io/projected/018b5460-f98e-40c3-a50c-c43ca05fa0ef-kube-api-access-prxfj\") pod \"multus-w69vk\" (UID: \"018b5460-f98e-40c3-a50c-c43ca05fa0ef\") " pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.061727 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.061688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b55hg\" (UniqueName: \"kubernetes.io/projected/e4d459d1-0077-45ce-b190-9cc485dd9ae5-kube-api-access-b55hg\") pod \"tuned-nhrx7\" (UID: \"e4d459d1-0077-45ce-b190-9cc485dd9ae5\") " pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:21.062882 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.062855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8b4\" (UniqueName: \"kubernetes.io/projected/82866f79-d634-4792-b3fd-ef2753feb90f-kube-api-access-zp8b4\") pod \"multus-additional-cni-plugins-jhxf7\" (UID: \"82866f79-d634-4792-b3fd-ef2753feb90f\") " pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:21.063640 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.063603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5lf\" (UniqueName: \"kubernetes.io/projected/591b1876-3766-4f35-9533-34a2ea6f684e-kube-api-access-9t5lf\") pod \"aws-ebs-csi-driver-node-kq7jc\" (UID: \"591b1876-3766-4f35-9533-34a2ea6f684e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:21.064571 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.064547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sv98\" (UniqueName: \"kubernetes.io/projected/43741ca5-8515-4e77-be0b-52d3495c2460-kube-api-access-2sv98\") pod \"ovnkube-node-fpxb2\" (UID: \"43741ca5-8515-4e77-be0b-52d3495c2460\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.064659 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.064641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l8bl\" (UniqueName: \"kubernetes.io/projected/b32c0a28-43de-408e-beab-64a0a9ff6ac3-kube-api-access-6l8bl\") pod \"node-ca-tg5n4\" (UID: \"b32c0a28-43de-408e-beab-64a0a9ff6ac3\") " pod="openshift-image-registry/node-ca-tg5n4"
Apr 23 14:57:21.065122 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.065103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsl4q\" (UniqueName: \"kubernetes.io/projected/1399ad28-3a6d-4e8a-9154-bb0eafc7e101-kube-api-access-nsl4q\") pod \"node-resolver-2hv6m\" (UID: \"1399ad28-3a6d-4e8a-9154-bb0eafc7e101\") " pod="openshift-dns/node-resolver-2hv6m"
Apr 23 14:57:21.069390 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.069371 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w69vk"
Apr 23 14:57:21.093000 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.092954 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018b5460_f98e_40c3_a50c_c43ca05fa0ef.slice/crio-21411b1dcffb263e0814dd6adb47ef7c56ebf5a19b50af95787a2c12f495c717 WatchSource:0}: Error finding container 21411b1dcffb263e0814dd6adb47ef7c56ebf5a19b50af95787a2c12f495c717: Status 404 returned error can't find the container with id 21411b1dcffb263e0814dd6adb47ef7c56ebf5a19b50af95787a2c12f495c717
Apr 23 14:57:21.094397 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.094377 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod237efac7542ae805317afa8331e5e27b.slice/crio-d692e1c2018d7330a5f60e6fff0895b79ef5850f3be1696fd338b3dd5c8caf0c WatchSource:0}: Error finding container d692e1c2018d7330a5f60e6fff0895b79ef5850f3be1696fd338b3dd5c8caf0c: Status 404 returned error can't find the container with id d692e1c2018d7330a5f60e6fff0895b79ef5850f3be1696fd338b3dd5c8caf0c
Apr 23 14:57:21.095186 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.095171 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d97c8c240a436d06b1c4f45cd224be.slice/crio-2ed715e609ad4d318d4acdfebd9beaa2e31ae3b926d4615ada1f674394c4cf57 WatchSource:0}: Error finding container 2ed715e609ad4d318d4acdfebd9beaa2e31ae3b926d4615ada1f674394c4cf57: Status 404 returned error can't find the container with id 2ed715e609ad4d318d4acdfebd9beaa2e31ae3b926d4615ada1f674394c4cf57
Apr 23 14:57:21.097795 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.097780 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 14:57:21.149276 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2c98eab9-7fb6-4067-92ac-e85bf7a1af4b-agent-certs\") pod \"konnectivity-agent-6ttdv\" (UID: \"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b\") " pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:21.149463 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-iptables-alerter-script\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.149537 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5sn7\" (UniqueName: \"kubernetes.io/projected/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-kube-api-access-n5sn7\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.149591 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2c98eab9-7fb6-4067-92ac-e85bf7a1af4b-konnectivity-ca\") pod \"konnectivity-agent-6ttdv\" (UID: \"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b\") " pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:21.149591 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-host-slash\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.149850 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-host-slash\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.149943 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.149925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-iptables-alerter-script\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.150225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.150204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2c98eab9-7fb6-4067-92ac-e85bf7a1af4b-konnectivity-ca\") pod \"konnectivity-agent-6ttdv\" (UID: \"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b\") " pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:21.152198 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.152175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2c98eab9-7fb6-4067-92ac-e85bf7a1af4b-agent-certs\") pod \"konnectivity-agent-6ttdv\" (UID: \"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b\") " pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:21.163352 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.163329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5sn7\" (UniqueName: \"kubernetes.io/projected/aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4-kube-api-access-n5sn7\") pod \"iptables-alerter-k777l\" (UID: \"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4\") " pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.265354 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.265264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nhrx7"
Apr 23 14:57:21.271181 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.271156 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4d459d1_0077_45ce_b190_9cc485dd9ae5.slice/crio-6096189420d508255c730e5ff520c3dce07e584693bedc9e917bad72c4005f32 WatchSource:0}: Error finding container 6096189420d508255c730e5ff520c3dce07e584693bedc9e917bad72c4005f32: Status 404 returned error can't find the container with id 6096189420d508255c730e5ff520c3dce07e584693bedc9e917bad72c4005f32
Apr 23 14:57:21.280153 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.280132 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tg5n4"
Apr 23 14:57:21.285656 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.285620 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb32c0a28_43de_408e_beab_64a0a9ff6ac3.slice/crio-0a8b41c68038e9c2d3dda3478383763d44c71d0db8fefb48e193f638d1598440 WatchSource:0}: Error finding container 0a8b41c68038e9c2d3dda3478383763d44c71d0db8fefb48e193f638d1598440: Status 404 returned error can't find the container with id 0a8b41c68038e9c2d3dda3478383763d44c71d0db8fefb48e193f638d1598440
Apr 23 14:57:21.288598 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.288581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:21.294998 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.294976 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43741ca5_8515_4e77_be0b_52d3495c2460.slice/crio-2762162d287dbd8393497ed6c630ec80575e74aa65aa717bf4750e229ca30e67 WatchSource:0}: Error finding container 2762162d287dbd8393497ed6c630ec80575e74aa65aa717bf4750e229ca30e67: Status 404 returned error can't find the container with id 2762162d287dbd8393497ed6c630ec80575e74aa65aa717bf4750e229ca30e67
Apr 23 14:57:21.296867 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.296850 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:57:21.312073 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.312048 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc"
Apr 23 14:57:21.317871 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.317842 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591b1876_3766_4f35_9533_34a2ea6f684e.slice/crio-0df98280cc4766110f0e7120265e2c42f8ac725a1b95b8ac345475fc5032bf68 WatchSource:0}: Error finding container 0df98280cc4766110f0e7120265e2c42f8ac725a1b95b8ac345475fc5032bf68: Status 404 returned error can't find the container with id 0df98280cc4766110f0e7120265e2c42f8ac725a1b95b8ac345475fc5032bf68
Apr 23 14:57:21.330446 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.330422 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2hv6m"
Apr 23 14:57:21.336115 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.336090 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1399ad28_3a6d_4e8a_9154_bb0eafc7e101.slice/crio-b8cbe1da7086bc7af1b5ac8d1726f4003cd6a6bf9ca289d0ee6ba0b9b3a0a5e7 WatchSource:0}: Error finding container b8cbe1da7086bc7af1b5ac8d1726f4003cd6a6bf9ca289d0ee6ba0b9b3a0a5e7: Status 404 returned error can't find the container with id b8cbe1da7086bc7af1b5ac8d1726f4003cd6a6bf9ca289d0ee6ba0b9b3a0a5e7
Apr 23 14:57:21.350563 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.350542 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jhxf7"
Apr 23 14:57:21.356736 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.356690 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82866f79_d634_4792_b3fd_ef2753feb90f.slice/crio-5b53a0e0be7e2f1ae0a36fe9f36499f40c8b54fc44182951c1251f6b4bc1bb2d WatchSource:0}: Error finding container 5b53a0e0be7e2f1ae0a36fe9f36499f40c8b54fc44182951c1251f6b4bc1bb2d: Status 404 returned error can't find the container with id 5b53a0e0be7e2f1ae0a36fe9f36499f40c8b54fc44182951c1251f6b4bc1bb2d
Apr 23 14:57:21.386242 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.386209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k777l"
Apr 23 14:57:21.391011 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.390990 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:21.393661 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.393637 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2c92c0_87e9_40a1_8ef8_7cb7d5e860d4.slice/crio-1b9a4416e47e4a551d7c1cf12bc7e596afde928c7c732506485fb3512ac1ad73 WatchSource:0}: Error finding container 1b9a4416e47e4a551d7c1cf12bc7e596afde928c7c732506485fb3512ac1ad73: Status 404 returned error can't find the container with id 1b9a4416e47e4a551d7c1cf12bc7e596afde928c7c732506485fb3512ac1ad73
Apr 23 14:57:21.401201 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:57:21.400136 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c98eab9_7fb6_4067_92ac_e85bf7a1af4b.slice/crio-787aa3e4c155c5d76ee7899fa4041b614404882f41e60489b37e52a3f668f667 WatchSource:0}: Error finding container 787aa3e4c155c5d76ee7899fa4041b614404882f41e60489b37e52a3f668f667: Status 404 returned error can't find the container with id 787aa3e4c155c5d76ee7899fa4041b614404882f41e60489b37e52a3f668f667
Apr 23 14:57:21.552667 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.552567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:21.552831 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.552730 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:21.552831 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.552799 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. No retries permitted until 2026-04-23 14:57:22.552777718 +0000 UTC m=+3.059860105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:21.654329 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.653667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:21.654329 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.653875 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:57:21.654329 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.653895 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:57:21.654329 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.653907 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:21.654329 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:21.653966 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. No retries permitted until 2026-04-23 14:57:22.653948244 +0000 UTC m=+3.161030625 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:21.996619 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.996545 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 14:52:20 +0000 UTC" deadline="2027-09-21 11:20:28.481621378 +0000 UTC"
Apr 23 14:57:21.996619 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:21.996570 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12380h23m6.485053713s"
Apr 23 14:57:22.021331 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.021084 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:57:22.115027 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.114913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tg5n4" event={"ID":"b32c0a28-43de-408e-beab-64a0a9ff6ac3","Type":"ContainerStarted","Data":"0a8b41c68038e9c2d3dda3478383763d44c71d0db8fefb48e193f638d1598440"}
Apr 23 14:57:22.147771 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.147543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerStarted","Data":"2ed715e609ad4d318d4acdfebd9beaa2e31ae3b926d4615ada1f674394c4cf57"}
Apr 23 14:57:22.172331 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.172250 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" event={"ID":"237efac7542ae805317afa8331e5e27b","Type":"ContainerStarted","Data":"d692e1c2018d7330a5f60e6fff0895b79ef5850f3be1696fd338b3dd5c8caf0c"}
Apr 23 14:57:22.187052 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.186978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2hv6m" event={"ID":"1399ad28-3a6d-4e8a-9154-bb0eafc7e101","Type":"ContainerStarted","Data":"b8cbe1da7086bc7af1b5ac8d1726f4003cd6a6bf9ca289d0ee6ba0b9b3a0a5e7"}
Apr 23 14:57:22.210295 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.210234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" event={"ID":"591b1876-3766-4f35-9533-34a2ea6f684e","Type":"ContainerStarted","Data":"0df98280cc4766110f0e7120265e2c42f8ac725a1b95b8ac345475fc5032bf68"}
Apr 23 14:57:22.220049 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.220011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"2762162d287dbd8393497ed6c630ec80575e74aa65aa717bf4750e229ca30e67"}
Apr 23 14:57:22.239155 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.239071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" event={"ID":"e4d459d1-0077-45ce-b190-9cc485dd9ae5","Type":"ContainerStarted","Data":"6096189420d508255c730e5ff520c3dce07e584693bedc9e917bad72c4005f32"}
Apr 23 14:57:22.270342 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.270244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w69vk" event={"ID":"018b5460-f98e-40c3-a50c-c43ca05fa0ef","Type":"ContainerStarted","Data":"21411b1dcffb263e0814dd6adb47ef7c56ebf5a19b50af95787a2c12f495c717"}
Apr 23 14:57:22.294465 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.294426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6ttdv" event={"ID":"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b","Type":"ContainerStarted","Data":"787aa3e4c155c5d76ee7899fa4041b614404882f41e60489b37e52a3f668f667"}
Apr 23 14:57:22.328745 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.328689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k777l" event={"ID":"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4","Type":"ContainerStarted","Data":"1b9a4416e47e4a551d7c1cf12bc7e596afde928c7c732506485fb3512ac1ad73"}
Apr 23 14:57:22.336961 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.336906 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerStarted","Data":"5b53a0e0be7e2f1ae0a36fe9f36499f40c8b54fc44182951c1251f6b4bc1bb2d"}
Apr 23 14:57:22.430877 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.430847 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 14:57:22.563828 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.563227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:22.563828 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:22.563358 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:22.563828 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:22.563418 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. No retries permitted until 2026-04-23 14:57:24.563399782 +0000 UTC m=+5.070482161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:22.664089 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.664055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:22.664268 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:22.664239 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:57:22.664268 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:22.664261 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:57:22.664372 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:22.664274 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:22.664372 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:22.664336 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. No retries permitted until 2026-04-23 14:57:24.664314855 +0000 UTC m=+5.171397235 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:22.996971 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.996875 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 14:52:20 +0000 UTC" deadline="2027-09-18 11:58:06.68095129 +0000 UTC"
Apr 23 14:57:22.996971 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:22.996915 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12309h0m43.684041349s"
Apr 23 14:57:23.091127 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:23.090397 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:23.091127 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:23.090551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:23.091127 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:23.090980 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:23.091127 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:23.091067 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:23.499806 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:23.499504 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 14:57:24.579634 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:24.579494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:24.580089 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:24.579657 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:57:24.580089 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:24.579749 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. 
No retries permitted until 2026-04-23 14:57:28.579728361 +0000 UTC m=+9.086810751 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:57:24.680937 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:24.680276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:24.680937 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:24.680488 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 14:57:24.680937 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:24.680510 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 14:57:24.680937 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:24.680524 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:57:24.680937 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:24.680582 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q 
podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. No retries permitted until 2026-04-23 14:57:28.680564145 +0000 UTC m=+9.187646514 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:57:25.090049 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:25.090008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:25.090238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:25.090019 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:25.090238 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:25.090160 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:25.090238 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:25.090209 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:27.090514 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:27.090482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:27.090977 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:27.090482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:27.090977 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:27.090657 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:27.090977 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:27.090741 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:28.615322 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:28.615131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:28.615322 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:28.615297 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:57:28.615937 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:28.615371 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. No retries permitted until 2026-04-23 14:57:36.615351166 +0000 UTC m=+17.122433533 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:57:28.716563 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:28.716500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:28.716728 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:28.716658 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 14:57:28.716800 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:28.716750 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 14:57:28.716800 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:28.716765 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:57:28.716887 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:28.716870 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. 
No retries permitted until 2026-04-23 14:57:36.716846821 +0000 UTC m=+17.223929202 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:57:29.090084 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:29.090047 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:29.090249 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:29.090167 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:29.090249 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:29.090226 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:29.090382 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:29.090338 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:31.089716 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:31.089670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:31.090158 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:31.089739 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:31.090158 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:31.089822 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:31.090158 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:31.089957 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:33.089728 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:33.089677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:33.090169 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:33.089736 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:33.090169 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:33.089843 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:33.090169 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:33.089958 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:35.090403 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:35.090326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:35.090834 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:35.090332 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:35.090834 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:35.090459 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:35.090834 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:35.090534 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:36.676808 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:36.676767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:36.677331 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:36.676929 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:57:36.677331 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:36.677000 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. No retries permitted until 2026-04-23 14:57:52.676985411 +0000 UTC m=+33.184067775 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 14:57:36.777586 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:36.777551 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:36.777776 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:36.777690 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 14:57:36.777776 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:36.777726 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 14:57:36.777776 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:36.777741 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:57:36.777926 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:36.777803 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. 
No retries permitted until 2026-04-23 14:57:52.777783708 +0000 UTC m=+33.284866101 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 14:57:37.090497 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:37.090403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:37.090644 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:37.090537 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:37.090644 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:37.090574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:37.090644 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:37.090634 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:39.090303 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:39.090258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:57:39.090656 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:39.090260 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:57:39.090656 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:39.090371 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df" Apr 23 14:57:39.090656 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:39.090447 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d" Apr 23 14:57:40.371284 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.370920 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" event={"ID":"237efac7542ae805317afa8331e5e27b","Type":"ContainerStarted","Data":"a5e88e2e3a4d400eb4750c737fa412c7cef7835a1c1d3f31330fc46acb76066c"} Apr 23 14:57:40.373508 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373489 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 14:57:40.373737 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373715 2577 generic.go:358] "Generic (PLEG): container finished" podID="43741ca5-8515-4e77-be0b-52d3495c2460" containerID="15272c8931fd6a98e7c16c27736c5f5a9e9fa3621f1c7b9dfb570b046783ba39" exitCode=1 Apr 23 14:57:40.373790 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"1e7d4fb974212ff124d535c06fa67a5e98c2857d29bc49741480ba5111d385a1"} Apr 23 14:57:40.373790 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"e9a4a90f4d91dbb063e4d23c7e08d988a9be579663ed62496b1fdf94c4211412"} Apr 23 14:57:40.373855 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"5717ec6359e5fde03cdf0c23702bf2a7d521b6dcd1aeb238d572d03d86655a1b"} Apr 23 
14:57:40.373855 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"b7f410805508c27baa168d08d4cd7fee7518554381c00956d3e51df8884423cf"}
Apr 23 14:57:40.373855 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerDied","Data":"15272c8931fd6a98e7c16c27736c5f5a9e9fa3621f1c7b9dfb570b046783ba39"}
Apr 23 14:57:40.373855 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.373826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"bc13d8291f2783e70741962bfc9aed4a071fc10c6b14a5974222a90a465bb3d7"}
Apr 23 14:57:40.374631 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.374614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" event={"ID":"e4d459d1-0077-45ce-b190-9cc485dd9ae5","Type":"ContainerStarted","Data":"68b62266c8613be02b45dc11b5719b91bf83306cd4500b8331539834dad0d42b"}
Apr 23 14:57:40.376284 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.376262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w69vk" event={"ID":"018b5460-f98e-40c3-a50c-c43ca05fa0ef","Type":"ContainerStarted","Data":"c5cd9a55218b65e8750396407e4261e95c50f4f9922efbab9ea2c837ffc389e0"}
Apr 23 14:57:40.419168 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.419117 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" podStartSLOduration=20.41910129 podStartE2EDuration="20.41910129s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:57:40.393872202 +0000 UTC m=+20.900954587" watchObservedRunningTime="2026-04-23 14:57:40.41910129 +0000 UTC m=+20.926183666"
Apr 23 14:57:40.422666 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.422621 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w69vk" podStartSLOduration=1.7403795560000002 podStartE2EDuration="20.422606041s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.098087477 +0000 UTC m=+1.605169848" lastFinishedPulling="2026-04-23 14:57:39.780313959 +0000 UTC m=+20.287396333" observedRunningTime="2026-04-23 14:57:40.419338165 +0000 UTC m=+20.926420551" watchObservedRunningTime="2026-04-23 14:57:40.422606041 +0000 UTC m=+20.929688426"
Apr 23 14:57:40.443675 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:40.443633 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nhrx7" podStartSLOduration=2.3658256460000002 podStartE2EDuration="20.443620478s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.272811213 +0000 UTC m=+1.779893580" lastFinishedPulling="2026-04-23 14:57:39.350606044 +0000 UTC m=+19.857688412" observedRunningTime="2026-04-23 14:57:40.443470222 +0000 UTC m=+20.950552621" watchObservedRunningTime="2026-04-23 14:57:40.443620478 +0000 UTC m=+20.950702863"
Apr 23 14:57:41.089504 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.089475 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:41.089680 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.089475 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:41.089680 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:41.089586 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:41.089680 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:41.089657 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:41.378792 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.378710 2577 generic.go:358] "Generic (PLEG): container finished" podID="82866f79-d634-4792-b3fd-ef2753feb90f" containerID="aefbef471cef39576c2f07dea52bde50f9b29ee5c5652c804940f6e249aa8d91" exitCode=0
Apr 23 14:57:41.378792 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.378734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerDied","Data":"aefbef471cef39576c2f07dea52bde50f9b29ee5c5652c804940f6e249aa8d91"}
Apr 23 14:57:41.380224 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.380140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tg5n4" event={"ID":"b32c0a28-43de-408e-beab-64a0a9ff6ac3","Type":"ContainerStarted","Data":"12d61d4fecdb1ed1f536f9ed6f11789b108a9cf8f7c052303121b706721c7a30"}
Apr 23 14:57:41.381594 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.381575 2577 generic.go:358] "Generic (PLEG): container finished" podID="27d97c8c240a436d06b1c4f45cd224be" containerID="1a91e64866030eaf494d071bab8d4a02c7b17315a080269b42f279f1a0292ec0" exitCode=0
Apr 23 14:57:41.381651 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.381634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerDied","Data":"1a91e64866030eaf494d071bab8d4a02c7b17315a080269b42f279f1a0292ec0"}
Apr 23 14:57:41.382939 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.382913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2hv6m" event={"ID":"1399ad28-3a6d-4e8a-9154-bb0eafc7e101","Type":"ContainerStarted","Data":"2851af6845ac7be3cbfad313c847c7265ecead9fc670b9a1c8276d2b87f87190"}
Apr 23 14:57:41.384128 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.384107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" event={"ID":"591b1876-3766-4f35-9533-34a2ea6f684e","Type":"ContainerStarted","Data":"7ace11d3d41d4790df5d927fefd51a36a0edc94ec31804fc6c7e9d4aad432cb4"}
Apr 23 14:57:41.385294 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.385274 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6ttdv" event={"ID":"2c98eab9-7fb6-4067-92ac-e85bf7a1af4b","Type":"ContainerStarted","Data":"6a1a3df461a3f0bb20c5a7434036806c402d9f5bf4b6779302a692610dcacf6d"}
Apr 23 14:57:41.386357 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.386332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k777l" event={"ID":"aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4","Type":"ContainerStarted","Data":"1c4f3cf4858b1e10f73b47ddb35b55c7766e17101b0580d7e4d157cfe0ef917a"}
Apr 23 14:57:41.425589 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.425545 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2hv6m" podStartSLOduration=3.438304158 podStartE2EDuration="21.425530283s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.337578063 +0000 UTC m=+1.844660431" lastFinishedPulling="2026-04-23 14:57:39.324804182 +0000 UTC m=+19.831886556" observedRunningTime="2026-04-23 14:57:41.425324781 +0000 UTC m=+21.932407168" watchObservedRunningTime="2026-04-23 14:57:41.425530283 +0000 UTC m=+21.932612668"
Apr 23 14:57:41.467729 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.467658 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-k777l" podStartSLOduration=3.51385763 podStartE2EDuration="21.46763897s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.395170638 +0000 UTC m=+1.902253006" lastFinishedPulling="2026-04-23 14:57:39.348951968 +0000 UTC m=+19.856034346" observedRunningTime="2026-04-23 14:57:41.467133263 +0000 UTC m=+21.974215649" watchObservedRunningTime="2026-04-23 14:57:41.46763897 +0000 UTC m=+21.974721358"
Apr 23 14:57:41.467971 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.467939 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tg5n4" podStartSLOduration=3.406813347 podStartE2EDuration="21.467929059s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.28717875 +0000 UTC m=+1.794261117" lastFinishedPulling="2026-04-23 14:57:39.348294465 +0000 UTC m=+19.855376829" observedRunningTime="2026-04-23 14:57:41.443422167 +0000 UTC m=+21.950504552" watchObservedRunningTime="2026-04-23 14:57:41.467929059 +0000 UTC m=+21.975011445"
Apr 23 14:57:41.507263 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.507217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6ttdv" podStartSLOduration=3.6159724730000002 podStartE2EDuration="21.507198033s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.403020547 +0000 UTC m=+1.910102915" lastFinishedPulling="2026-04-23 14:57:39.294246093 +0000 UTC m=+19.801328475" observedRunningTime="2026-04-23 14:57:41.489404798 +0000 UTC m=+21.996487185" watchObservedRunningTime="2026-04-23 14:57:41.507198033 +0000 UTC m=+22.014280422"
Apr 23 14:57:41.563785 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:41.563614 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 14:57:42.019395 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.019294 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T14:57:41.563780605Z","UUID":"156e49fb-1c73-46d3-965e-8757f4c408a8","Handler":null,"Name":"","Endpoint":""}
Apr 23 14:57:42.022148 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.022123 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 14:57:42.022148 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.022150 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 14:57:42.390748 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.390631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerStarted","Data":"5b388546ba87929708caaffe357d1d49a946b2d441d7d077c3c8068041e228fd"}
Apr 23 14:57:42.392693 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.392660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" event={"ID":"591b1876-3766-4f35-9533-34a2ea6f684e","Type":"ContainerStarted","Data":"94315ed58137af9f2c93bb9ffbb552ac5104de1e49935c5c4d9daaea76486e08"}
Apr 23 14:57:42.395962 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.395930 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 14:57:42.396751 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.396724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"1e08d5e3340ba1d00292cbd33ba30de6bafdbd80aff8db91432587a0d0e2f141"}
Apr 23 14:57:42.407802 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:42.407755 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" podStartSLOduration=22.407741077 podStartE2EDuration="22.407741077s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:57:42.407195319 +0000 UTC m=+22.914277705" watchObservedRunningTime="2026-04-23 14:57:42.407741077 +0000 UTC m=+22.914823466"
Apr 23 14:57:43.090281 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:43.090192 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:43.090451 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:43.090192 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:43.090451 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:43.090337 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:43.090451 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:43.090413 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:43.400586 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:43.400549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" event={"ID":"591b1876-3766-4f35-9533-34a2ea6f684e","Type":"ContainerStarted","Data":"460f62fcd2335affe1b3f26692cee7620999d56b5a086985be632a5b864e4f30"}
Apr 23 14:57:43.419577 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:43.419522 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kq7jc" podStartSLOduration=2.17978267 podStartE2EDuration="23.419502835s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.320298128 +0000 UTC m=+1.827380494" lastFinishedPulling="2026-04-23 14:57:42.560018291 +0000 UTC m=+23.067100659" observedRunningTime="2026-04-23 14:57:43.419379872 +0000 UTC m=+23.926462259" watchObservedRunningTime="2026-04-23 14:57:43.419502835 +0000 UTC m=+23.926585222"
Apr 23 14:57:45.089737 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:45.089684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:45.090328 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:45.089684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:45.090328 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:45.089834 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:45.090328 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:45.089869 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:46.034894 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.034619 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:46.035341 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.035323 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:46.407784 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.407759 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 14:57:46.408189 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.408066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"8cba331136ec5f5d07b87f2cada3b9db14e157b3d262c930712d01a4221b2013"}
Apr 23 14:57:46.408409 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.408384 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:46.408508 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.408414 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:46.408587 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.408570 2577 scope.go:117] "RemoveContainer" containerID="15272c8931fd6a98e7c16c27736c5f5a9e9fa3621f1c7b9dfb570b046783ba39"
Apr 23 14:57:46.410022 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.409564 2577 generic.go:358] "Generic (PLEG): container finished" podID="82866f79-d634-4792-b3fd-ef2753feb90f" containerID="071a3e910b2c0ffa8305469e50c8bd5c108965e86c25e048bd141155714d2908" exitCode=0
Apr 23 14:57:46.410022 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.409634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerDied","Data":"071a3e910b2c0ffa8305469e50c8bd5c108965e86c25e048bd141155714d2908"}
Apr 23 14:57:46.425156 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:46.425130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:47.089503 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.089464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:47.089674 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.089517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:47.089674 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:47.089591 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:47.089805 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:47.089727 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:47.413456 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.413419 2577 generic.go:358] "Generic (PLEG): container finished" podID="82866f79-d634-4792-b3fd-ef2753feb90f" containerID="56102d0da1c1885bab5237c2a071bdb7900ea49c172a31c4f15e73e823970f3c" exitCode=0
Apr 23 14:57:47.413879 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.413508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerDied","Data":"56102d0da1c1885bab5237c2a071bdb7900ea49c172a31c4f15e73e823970f3c"}
Apr 23 14:57:47.416805 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.416786 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 14:57:47.417098 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.417071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" event={"ID":"43741ca5-8515-4e77-be0b-52d3495c2460","Type":"ContainerStarted","Data":"d5714801b19a6a88fa8ca614585f0e1c87b4ba0a133d473126e3535ad9ce8157"}
Apr 23 14:57:47.417319 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.417296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:47.431085 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.431054 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2"
Apr 23 14:57:47.471233 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:47.471189 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" podStartSLOduration=9.372916333 podStartE2EDuration="27.471175516s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.296508357 +0000 UTC m=+1.803590724" lastFinishedPulling="2026-04-23 14:57:39.394767532 +0000 UTC m=+19.901849907" observedRunningTime="2026-04-23 14:57:47.470184691 +0000 UTC m=+27.977267067" watchObservedRunningTime="2026-04-23 14:57:47.471175516 +0000 UTC m=+27.978257901"
Apr 23 14:57:48.421470 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:48.421439 2577 generic.go:358] "Generic (PLEG): container finished" podID="82866f79-d634-4792-b3fd-ef2753feb90f" containerID="453ea73b6d5ca198a9eb060291915cbfb6b0b32ef64aa7fe867934d1d7236a5a" exitCode=0
Apr 23 14:57:48.421881 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:48.421524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerDied","Data":"453ea73b6d5ca198a9eb060291915cbfb6b0b32ef64aa7fe867934d1d7236a5a"}
Apr 23 14:57:49.090357 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:49.090323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:49.090527 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:49.090323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:49.090527 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:49.090453 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:49.090527 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:49.090518 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:49.862318 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:49.862061 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:49.862763 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:49.862460 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 14:57:49.862834 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:49.862814 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6ttdv"
Apr 23 14:57:51.089588 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:51.089533 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:51.090040 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:51.089539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:51.090040 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:51.089675 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:51.090040 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:51.089760 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:52.695708 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:52.695645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:52.696178 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:52.695847 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:52.696178 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:52.695949 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs podName:be6b2313-a857-46d8-8b0d-adbd4a48cb9d nodeName:}" failed. No retries permitted until 2026-04-23 14:58:24.695926005 +0000 UTC m=+65.203008374 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs") pod "network-metrics-daemon-bjpzv" (UID: "be6b2313-a857-46d8-8b0d-adbd4a48cb9d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 14:57:52.796129 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:52.796094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:52.796301 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:52.796273 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 14:57:52.796301 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:52.796298 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 14:57:52.796372 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:52.796312 2577 projected.go:194] Error preparing data for projected volume kube-api-access-rm97q for pod openshift-network-diagnostics/network-check-target-6rls8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:52.796426 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:52.796372 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q podName:3236e428-70b1-4400-9f33-348489a945df nodeName:}" failed. No retries permitted until 2026-04-23 14:58:24.796352869 +0000 UTC m=+65.303435237 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rm97q" (UniqueName: "kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q") pod "network-check-target-6rls8" (UID: "3236e428-70b1-4400-9f33-348489a945df") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 14:57:53.089829 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:53.089740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:53.089829 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:53.089777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:53.090124 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:53.089867 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:53.090124 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:53.089997 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:55.090077 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:55.090050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:55.090624 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:55.090050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:55.090624 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:55.090175 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:55.090624 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:55.090236 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:55.437518 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:55.437481 2577 generic.go:358] "Generic (PLEG): container finished" podID="82866f79-d634-4792-b3fd-ef2753feb90f" containerID="a0aa876fe1e94f89c42b78755379c6a333b2cab5fe25dd350b5ac0bcc5dbe8ff" exitCode=0
Apr 23 14:57:55.437663 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:55.437524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerDied","Data":"a0aa876fe1e94f89c42b78755379c6a333b2cab5fe25dd350b5ac0bcc5dbe8ff"}
Apr 23 14:57:56.359492 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:56.359460 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bjpzv"]
Apr 23 14:57:56.360202 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:56.359607 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:56.360202 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:56.359731 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:56.362437 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:56.362412 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6rls8"]
Apr 23 14:57:56.362556 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:56.362517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:56.362614 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:56.362596 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:56.442117 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:56.442084 2577 generic.go:358] "Generic (PLEG): container finished" podID="82866f79-d634-4792-b3fd-ef2753feb90f" containerID="514131f09992532234915a7c429c2e51906ae73838b4474e063743ed8f8aa19d" exitCode=0
Apr 23 14:57:56.442309 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:56.442131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerDied","Data":"514131f09992532234915a7c429c2e51906ae73838b4474e063743ed8f8aa19d"}
Apr 23 14:57:57.447279 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:57.447032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" event={"ID":"82866f79-d634-4792-b3fd-ef2753feb90f","Type":"ContainerStarted","Data":"9fd89bad13d2302c2713798cf48522e800deedf664666e71ef53822b4177b2a2"}
Apr 23 14:57:57.482501 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:57.482449 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jhxf7" podStartSLOduration=4.516651058 podStartE2EDuration="37.482434195s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:57:21.358156784 +0000 UTC m=+1.865239148" lastFinishedPulling="2026-04-23 14:57:54.323939922 +0000 UTC m=+34.831022285" observedRunningTime="2026-04-23 14:57:57.481993458 +0000 UTC m=+37.989075843" watchObservedRunningTime="2026-04-23 14:57:57.482434195 +0000 UTC m=+37.989516581"
Apr 23 14:57:58.090046 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:58.090018 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8"
Apr 23 14:57:58.090224 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:58.090025 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv"
Apr 23 14:57:58.090224 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:58.090130 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6rls8" podUID="3236e428-70b1-4400-9f33-348489a945df"
Apr 23 14:57:58.090358 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:57:58.090243 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bjpzv" podUID="be6b2313-a857-46d8-8b0d-adbd4a48cb9d"
Apr 23 14:57:59.868509 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.868481 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeReady"
Apr 23 14:57:59.868958 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.868588 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 14:57:59.903332 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.903301 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-744fbcdbfd-v2rps"]
Apr 23 14:57:59.926181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.926153 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d9zxh"]
Apr 23 14:57:59.926365 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.926327 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps"
Apr 23 14:57:59.928823 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.928791 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 14:57:59.928941 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.928912 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 14:57:59.928941 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.928913 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dxrvj\""
Apr 23 14:57:59.929040 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.928917 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 14:57:59.934939 ip-10-0-141-16 kubenswrapper[2577]: I0423
14:57:59.934886 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 14:57:59.942157 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.942132 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-744fbcdbfd-v2rps"] Apr 23 14:57:59.942285 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.942165 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jvjqc"] Apr 23 14:57:59.942285 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.942272 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d9zxh" Apr 23 14:57:59.944800 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.944779 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 14:57:59.944902 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.944779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zm2h4\"" Apr 23 14:57:59.944902 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.944779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 14:57:59.956823 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.956802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jvjqc"] Apr 23 14:57:59.956823 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.956827 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d9zxh"] Apr 23 14:57:59.956970 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.956933 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:57:59.959274 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.959251 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 14:57:59.959371 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.959293 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 14:57:59.959371 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.959312 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nnkvd\"" Apr 23 14:57:59.959371 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:57:59.959260 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 14:58:00.048944 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.048906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/938f7023-4fea-4435-b684-b2a0a3193583-cert\") pod \"ingress-canary-jvjqc\" (UID: \"938f7023-4fea-4435-b684-b2a0a3193583\") " pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.048944 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.048941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-metrics-tls\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.048962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22wq4\" (UniqueName: 
\"kubernetes.io/projected/938f7023-4fea-4435-b684-b2a0a3193583-kube-api-access-22wq4\") pod \"ingress-canary-jvjqc\" (UID: \"938f7023-4fea-4435-b684-b2a0a3193583\") " pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.048981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/432ca574-ef9a-4480-b470-c447442aa39a-trusted-ca\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.048997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/432ca574-ef9a-4480-b470-c447442aa39a-installation-pull-secrets\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-tmp-dir\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/432ca574-ef9a-4480-b470-c447442aa39a-image-registry-private-configuration\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " 
pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-229r8\" (UniqueName: \"kubernetes.io/projected/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-kube-api-access-229r8\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/432ca574-ef9a-4480-b470-c447442aa39a-registry-certificates\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6m5k\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-kube-api-access-d6m5k\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049181 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/432ca574-ef9a-4480-b470-c447442aa39a-ca-trust-extracted\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049458 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049197 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-config-volume\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.049458 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-bound-sa-token\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.049458 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.049265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-registry-tls\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.090355 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.090319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:58:00.090546 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.090524 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:58:00.093520 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.093496 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-shvg2\"" Apr 23 14:58:00.093641 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.093572 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 14:58:00.093641 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.093581 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 14:58:00.093737 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.093718 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 14:58:00.093975 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.093961 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qqjqd\"" Apr 23 14:58:00.150392 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-bound-sa-token\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.150541 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-registry-tls\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " 
pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.150541 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/938f7023-4fea-4435-b684-b2a0a3193583-cert\") pod \"ingress-canary-jvjqc\" (UID: \"938f7023-4fea-4435-b684-b2a0a3193583\") " pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.150541 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-metrics-tls\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.150541 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22wq4\" (UniqueName: \"kubernetes.io/projected/938f7023-4fea-4435-b684-b2a0a3193583-kube-api-access-22wq4\") pod \"ingress-canary-jvjqc\" (UID: \"938f7023-4fea-4435-b684-b2a0a3193583\") " pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.150770 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/432ca574-ef9a-4480-b470-c447442aa39a-trusted-ca\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.150770 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/432ca574-ef9a-4480-b470-c447442aa39a-installation-pull-secrets\") pod \"image-registry-744fbcdbfd-v2rps\" 
(UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.150878 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-tmp-dir\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.150878 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/432ca574-ef9a-4480-b470-c447442aa39a-image-registry-private-configuration\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.150976 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-229r8\" (UniqueName: \"kubernetes.io/projected/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-kube-api-access-229r8\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.150976 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/432ca574-ef9a-4480-b470-c447442aa39a-registry-certificates\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.151069 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.150998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6m5k\" 
(UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-kube-api-access-d6m5k\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.151069 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.151033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/432ca574-ef9a-4480-b470-c447442aa39a-ca-trust-extracted\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.151069 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.151040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-tmp-dir\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.151069 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.151058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-config-volume\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.151531 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.151509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-config-volume\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.151630 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.151517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/432ca574-ef9a-4480-b470-c447442aa39a-ca-trust-extracted\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.152047 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.152025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/432ca574-ef9a-4480-b470-c447442aa39a-registry-certificates\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.152260 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.152235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/432ca574-ef9a-4480-b470-c447442aa39a-trusted-ca\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.155082 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.155059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-metrics-tls\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.155158 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.155134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/432ca574-ef9a-4480-b470-c447442aa39a-image-registry-private-configuration\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.155210 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:58:00.155179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/432ca574-ef9a-4480-b470-c447442aa39a-installation-pull-secrets\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.155247 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.155202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-registry-tls\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.155247 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.155217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/938f7023-4fea-4435-b684-b2a0a3193583-cert\") pod \"ingress-canary-jvjqc\" (UID: \"938f7023-4fea-4435-b684-b2a0a3193583\") " pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.160332 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.160310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-bound-sa-token\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.160605 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.160585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-229r8\" (UniqueName: \"kubernetes.io/projected/b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea-kube-api-access-229r8\") pod \"dns-default-d9zxh\" (UID: \"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea\") " pod="openshift-dns/dns-default-d9zxh" Apr 23 
14:58:00.160870 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.160853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6m5k\" (UniqueName: \"kubernetes.io/projected/432ca574-ef9a-4480-b470-c447442aa39a-kube-api-access-d6m5k\") pod \"image-registry-744fbcdbfd-v2rps\" (UID: \"432ca574-ef9a-4480-b470-c447442aa39a\") " pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.172883 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.172859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22wq4\" (UniqueName: \"kubernetes.io/projected/938f7023-4fea-4435-b684-b2a0a3193583-kube-api-access-22wq4\") pod \"ingress-canary-jvjqc\" (UID: \"938f7023-4fea-4435-b684-b2a0a3193583\") " pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.237807 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.237767 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:00.251657 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.251626 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:00.264279 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.264232 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jvjqc" Apr 23 14:58:00.428252 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.427555 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-744fbcdbfd-v2rps"] Apr 23 14:58:00.429943 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.429918 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d9zxh"] Apr 23 14:58:00.430902 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:00.430878 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432ca574_ef9a_4480_b470_c447442aa39a.slice/crio-99ee629ed39143e0644ee38c75d7d2eeaaef8cf72353654d6adfae3777a438c0 WatchSource:0}: Error finding container 99ee629ed39143e0644ee38c75d7d2eeaaef8cf72353654d6adfae3777a438c0: Status 404 returned error can't find the container with id 99ee629ed39143e0644ee38c75d7d2eeaaef8cf72353654d6adfae3777a438c0 Apr 23 14:58:00.433052 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:00.432973 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb24ca5ef_2f02_4ddb_9e88_c2eae3af67ea.slice/crio-77b21dd46d4a7e34d67ee895a73c380806db27822913ab51b367b2d3ac1df219 WatchSource:0}: Error finding container 77b21dd46d4a7e34d67ee895a73c380806db27822913ab51b367b2d3ac1df219: Status 404 returned error can't find the container with id 77b21dd46d4a7e34d67ee895a73c380806db27822913ab51b367b2d3ac1df219 Apr 23 14:58:00.434118 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.434091 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jvjqc"] Apr 23 14:58:00.439186 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:00.439168 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938f7023_4fea_4435_b684_b2a0a3193583.slice/crio-6685c3c97f8e71dea0caeeb4fb89c8ba27728f21cf8b2ce5a9ce5f21c509fba8 WatchSource:0}: Error finding container 6685c3c97f8e71dea0caeeb4fb89c8ba27728f21cf8b2ce5a9ce5f21c509fba8: Status 404 returned error can't find the container with id 6685c3c97f8e71dea0caeeb4fb89c8ba27728f21cf8b2ce5a9ce5f21c509fba8 Apr 23 14:58:00.452946 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.452909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jvjqc" event={"ID":"938f7023-4fea-4435-b684-b2a0a3193583","Type":"ContainerStarted","Data":"6685c3c97f8e71dea0caeeb4fb89c8ba27728f21cf8b2ce5a9ce5f21c509fba8"} Apr 23 14:58:00.453863 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.453843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d9zxh" event={"ID":"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea","Type":"ContainerStarted","Data":"77b21dd46d4a7e34d67ee895a73c380806db27822913ab51b367b2d3ac1df219"} Apr 23 14:58:00.454711 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:00.454676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" event={"ID":"432ca574-ef9a-4480-b470-c447442aa39a","Type":"ContainerStarted","Data":"99ee629ed39143e0644ee38c75d7d2eeaaef8cf72353654d6adfae3777a438c0"} Apr 23 14:58:01.458844 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:01.458806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" event={"ID":"432ca574-ef9a-4480-b470-c447442aa39a","Type":"ContainerStarted","Data":"9cb064d47ab4984e6ed4a63e0d93653c51888ef1588b63713ba3ce84365b7122"} Apr 23 14:58:01.459625 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:01.459279 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:01.483943 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:01.483880 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" podStartSLOduration=6.483860507 podStartE2EDuration="6.483860507s" podCreationTimestamp="2026-04-23 14:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:58:01.482988799 +0000 UTC m=+41.990071188" watchObservedRunningTime="2026-04-23 14:58:01.483860507 +0000 UTC m=+41.990942874" Apr 23 14:58:03.463980 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:03.463716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jvjqc" event={"ID":"938f7023-4fea-4435-b684-b2a0a3193583","Type":"ContainerStarted","Data":"255f278dad63fe861ae0bc87d3bd37356b15c0bd62641f08322ae02ad6440214"} Apr 23 14:58:03.465255 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:03.465227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d9zxh" event={"ID":"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea","Type":"ContainerStarted","Data":"c44195478e01319001fcca208062089cfd798e668d8a9723dbcbe4506543247e"} Apr 23 14:58:03.465255 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:03.465258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d9zxh" event={"ID":"b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea","Type":"ContainerStarted","Data":"d3674d0e425530bf7d124edc9f64f77a8b0852cee2d52d1d53bd9964f06d13a6"} Apr 23 14:58:03.465421 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:03.465356 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-d9zxh" Apr 23 14:58:03.479501 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:03.479431 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-jvjqc" podStartSLOduration=1.885350664 podStartE2EDuration="4.479414046s" podCreationTimestamp="2026-04-23 14:57:59 +0000 UTC" firstStartedPulling="2026-04-23 14:58:00.44095923 +0000 UTC m=+40.948041597" lastFinishedPulling="2026-04-23 14:58:03.035022612 +0000 UTC m=+43.542104979" observedRunningTime="2026-04-23 14:58:03.478613931 +0000 UTC m=+43.985696317" watchObservedRunningTime="2026-04-23 14:58:03.479414046 +0000 UTC m=+43.986496498" Apr 23 14:58:03.494935 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:03.494887 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d9zxh" podStartSLOduration=1.900980637 podStartE2EDuration="4.494873026s" podCreationTimestamp="2026-04-23 14:57:59 +0000 UTC" firstStartedPulling="2026-04-23 14:58:00.436569916 +0000 UTC m=+40.943652284" lastFinishedPulling="2026-04-23 14:58:03.030462305 +0000 UTC m=+43.537544673" observedRunningTime="2026-04-23 14:58:03.493938254 +0000 UTC m=+44.001020651" watchObservedRunningTime="2026-04-23 14:58:03.494873026 +0000 UTC m=+44.001955411" Apr 23 14:58:05.089560 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:05.089512 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d9zxh_b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea/dns/0.log" Apr 23 14:58:05.265347 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:05.265280 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d9zxh_b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea/kube-rbac-proxy/0.log" Apr 23 14:58:06.273640 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:06.273615 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2hv6m_1399ad28-3a6d-4e8a-9154-bb0eafc7e101/dns-node-resolver/0.log" Apr 23 14:58:07.468984 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:07.468957 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-jvjqc_938f7023-4fea-4435-b684-b2a0a3193583/serve-healthcheck-canary/0.log" Apr 23 14:58:12.032232 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.032197 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5fx2c"] Apr 23 14:58:12.037340 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.037319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.039694 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.039673 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4ln87\"" Apr 23 14:58:12.040407 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.040384 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 14:58:12.040473 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.040457 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 14:58:12.040518 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.040480 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 14:58:12.040899 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.040882 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 14:58:12.048452 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.048429 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5fx2c"] Apr 23 14:58:12.135091 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.135054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/329f6021-c417-448e-80e7-c55ea27c6445-crio-socket\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.135091 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.135094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/329f6021-c417-448e-80e7-c55ea27c6445-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.135291 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.135126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/329f6021-c417-448e-80e7-c55ea27c6445-data-volume\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.135291 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.135190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/329f6021-c417-448e-80e7-c55ea27c6445-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.135291 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.135257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27x8\" (UniqueName: \"kubernetes.io/projected/329f6021-c417-448e-80e7-c55ea27c6445-kube-api-access-q27x8\") pod \"insights-runtime-extractor-5fx2c\" (UID: 
\"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.171032 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.171000 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"] Apr 23 14:58:12.174147 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.174131 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.177076 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.177051 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 14:58:12.177259 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.177235 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 14:58:12.177789 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.177766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 14:58:12.178051 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.178028 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 14:58:12.178220 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.178203 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 14:58:12.180742 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.180718 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-kpmbk\"" Apr 23 14:58:12.188038 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.188006 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"] Apr 23 14:58:12.201266 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.201239 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-t7hbb"] Apr 23 14:58:12.204706 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.204679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.208546 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.208519 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 14:58:12.208664 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.208554 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 14:58:12.208825 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.208810 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x6ghm\"" Apr 23 14:58:12.208906 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.208811 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 14:58:12.235614 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.235588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/329f6021-c417-448e-80e7-c55ea27c6445-crio-socket\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.235814 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.235624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/329f6021-c417-448e-80e7-c55ea27c6445-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.235814 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.235643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/329f6021-c417-448e-80e7-c55ea27c6445-data-volume\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.235814 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.235678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/329f6021-c417-448e-80e7-c55ea27c6445-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.235814 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.235741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/329f6021-c417-448e-80e7-c55ea27c6445-crio-socket\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.235814 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.235755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q27x8\" (UniqueName: \"kubernetes.io/projected/329f6021-c417-448e-80e7-c55ea27c6445-kube-api-access-q27x8\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.236184 ip-10-0-141-16 kubenswrapper[2577]: 
I0423 14:58:12.236160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/329f6021-c417-448e-80e7-c55ea27c6445-data-volume\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.236419 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.236399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/329f6021-c417-448e-80e7-c55ea27c6445-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.239741 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.239709 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/329f6021-c417-448e-80e7-c55ea27c6445-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.247961 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.247942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q27x8\" (UniqueName: \"kubernetes.io/projected/329f6021-c417-448e-80e7-c55ea27c6445-kube-api-access-q27x8\") pod \"insights-runtime-extractor-5fx2c\" (UID: \"329f6021-c417-448e-80e7-c55ea27c6445\") " pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.337009 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.336916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.337009 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.336962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-textfile\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337009 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.336991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-accelerators-collector-config\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sg4l\" (UniqueName: \"kubernetes.io/projected/94819bd0-3e06-4ceb-94d5-520933061da5-kube-api-access-9sg4l\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94819bd0-3e06-4ceb-94d5-520933061da5-metrics-client-ca\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:58:12.337136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-root\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-wtmp\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43c1c226-867a-4730-91bb-e0380cebaf6e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.337238 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-sys\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337463 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-tls\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337463 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.337463 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.337345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpv96\" (UniqueName: \"kubernetes.io/projected/43c1c226-867a-4730-91bb-e0380cebaf6e-kube-api-access-rpv96\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.347106 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.347074 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5fx2c" Apr 23 14:58:12.438457 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpv96\" (UniqueName: \"kubernetes.io/projected/43c1c226-867a-4730-91bb-e0380cebaf6e-kube-api-access-rpv96\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.438457 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-textfile\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-accelerators-collector-config\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438529 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9sg4l\" (UniqueName: \"kubernetes.io/projected/94819bd0-3e06-4ceb-94d5-520933061da5-kube-api-access-9sg4l\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94819bd0-3e06-4ceb-94d5-520933061da5-metrics-client-ca\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-root\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-wtmp\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" 
Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:58:12.438680 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43c1c226-867a-4730-91bb-e0380cebaf6e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.438732 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-sys\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-tls\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:58:12.438786 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-tls podName:43c1c226-867a-4730-91bb-e0380cebaf6e nodeName:}" failed. No retries permitted until 2026-04-23 14:58:12.93876053 +0000 UTC m=+53.445842909 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-kh5pf" (UID: "43c1c226-867a-4730-91bb-e0380cebaf6e") : secret "openshift-state-metrics-tls" not found Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:58:12.438841 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:58:12.438895 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-tls podName:94819bd0-3e06-4ceb-94d5-520933061da5 nodeName:}" failed. No retries permitted until 2026-04-23 14:58:12.938879744 +0000 UTC m=+53.445962121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-tls") pod "node-exporter-t7hbb" (UID: "94819bd0-3e06-4ceb-94d5-520933061da5") : secret "node-exporter-tls" not found Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438915 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-textfile\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.438941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-root\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.439162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94819bd0-3e06-4ceb-94d5-520933061da5-metrics-client-ca\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.439201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-accelerators-collector-config\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439227 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.439224 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-sys\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439747 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.439342 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-wtmp\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.439818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.439768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43c1c226-867a-4730-91bb-e0380cebaf6e-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" Apr 23 14:58:12.441855 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.441829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb" Apr 23 14:58:12.441979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.441959 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"
Apr 23 14:58:12.447337 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.447311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpv96\" (UniqueName: \"kubernetes.io/projected/43c1c226-867a-4730-91bb-e0380cebaf6e-kube-api-access-rpv96\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"
Apr 23 14:58:12.447470 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.447377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sg4l\" (UniqueName: \"kubernetes.io/projected/94819bd0-3e06-4ceb-94d5-520933061da5-kube-api-access-9sg4l\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb"
Apr 23 14:58:12.478101 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.478063 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5fx2c"]
Apr 23 14:58:12.483394 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:12.483370 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod329f6021_c417_448e_80e7_c55ea27c6445.slice/crio-1beff3cc922021c4beace403bf554cdd39f605f2fe55f1f4ea620fd7b57bb1b3 WatchSource:0}: Error finding container 1beff3cc922021c4beace403bf554cdd39f605f2fe55f1f4ea620fd7b57bb1b3: Status 404 returned error can't find the container with id 1beff3cc922021c4beace403bf554cdd39f605f2fe55f1f4ea620fd7b57bb1b3
Apr 23 14:58:12.943657 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.943620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-tls\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb"
Apr 23 14:58:12.943858 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.943682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"
Apr 23 14:58:12.946163 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.946131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/94819bd0-3e06-4ceb-94d5-520933061da5-node-exporter-tls\") pod \"node-exporter-t7hbb\" (UID: \"94819bd0-3e06-4ceb-94d5-520933061da5\") " pod="openshift-monitoring/node-exporter-t7hbb"
Apr 23 14:58:12.946163 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:12.946162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/43c1c226-867a-4730-91bb-e0380cebaf6e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kh5pf\" (UID: \"43c1c226-867a-4730-91bb-e0380cebaf6e\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"
Apr 23 14:58:13.084964 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.084928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"
Apr 23 14:58:13.113143 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.113101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-t7hbb"
Apr 23 14:58:13.262080 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.261846 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:58:13.266943 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.266916 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.270258 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270150 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 14:58:13.270258 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7pr5v\""
Apr 23 14:58:13.270258 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270230 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 14:58:13.270504 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270421 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 14:58:13.270729 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 14:58:13.270729 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270564 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 14:58:13.270862 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 14:58:13.270917 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270875 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 14:58:13.270917 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.270899 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 14:58:13.271133 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.271120 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 14:58:13.280390 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.280367 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:58:13.298054 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.298025 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf"]
Apr 23 14:58:13.302247 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:13.302202 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c1c226_867a_4730_91bb_e0380cebaf6e.slice/crio-8e048ec20733983d7959180a2372ab3f4c31a088e4a0cb4be325b49e64a95194 WatchSource:0}: Error finding container 8e048ec20733983d7959180a2372ab3f4c31a088e4a0cb4be325b49e64a95194: Status 404 returned error can't find the container with id 8e048ec20733983d7959180a2372ab3f4c31a088e4a0cb4be325b49e64a95194
Apr 23 14:58:13.347210 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82bql\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-kube-api-access-82bql\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347353 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-config-out\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347353 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-web-config\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347454 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347454 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347454 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347454 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-tls-assets\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347646 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347470 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347646 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-config-volume\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347646 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347646 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347859 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.347859 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.347708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448084 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448084 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82bql\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-kube-api-access-82bql\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-config-out\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-web-config\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448192 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-tls-assets\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448292 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448633 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-config-volume\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448633 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448633 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448633 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.448381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.448836 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:58:13.448813 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle podName:934c6057-02f6-496b-b5b4-27daa3211404 nodeName:}" failed. No retries permitted until 2026-04-23 14:58:13.948789696 +0000 UTC m=+54.455872077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "934c6057-02f6-496b-b5b4-27daa3211404") : configmap references non-existent config key: ca-bundle.crt
Apr 23 14:58:13.449530 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.449072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.449530 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.449134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.452629 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.452598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-config-volume\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.452763 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.452596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.452966 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.452947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-config-out\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.453057 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.453001 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-tls-assets\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.453057 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.453002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.453264 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.453244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.453571 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.453547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-web-config\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.453915 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.453898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.454573 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.454552 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.463027 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.462957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82bql\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-kube-api-access-82bql\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.469448 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.469431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d9zxh"
Apr 23 14:58:13.486971 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.486939 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fx2c" event={"ID":"329f6021-c417-448e-80e7-c55ea27c6445","Type":"ContainerStarted","Data":"a418d51cf618eaf9350cac7f8d72601375ef9e63fd4f3b5cfdbbde2124d9e8cd"}
Apr 23 14:58:13.486971 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.486973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fx2c" event={"ID":"329f6021-c417-448e-80e7-c55ea27c6445","Type":"ContainerStarted","Data":"50b8f8b97a0556c655bb561144fb658b7f55fcce605d83438cf5cdcbfa77132f"}
Apr 23 14:58:13.487132 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.486986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fx2c" event={"ID":"329f6021-c417-448e-80e7-c55ea27c6445","Type":"ContainerStarted","Data":"1beff3cc922021c4beace403bf554cdd39f605f2fe55f1f4ea620fd7b57bb1b3"}
Apr 23 14:58:13.487951 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.487926 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7hbb" event={"ID":"94819bd0-3e06-4ceb-94d5-520933061da5","Type":"ContainerStarted","Data":"c586dc0cd8115e0983f1012bf5b38ffdc77a200ddaeeca975edfbeaf826e1348"}
Apr 23 14:58:13.489429 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.489410 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" event={"ID":"43c1c226-867a-4730-91bb-e0380cebaf6e","Type":"ContainerStarted","Data":"b5735a5c86339470febd76a34f9c75e0535852269bbeadffe458a3102d787534"}
Apr 23 14:58:13.489506 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.489432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" event={"ID":"43c1c226-867a-4730-91bb-e0380cebaf6e","Type":"ContainerStarted","Data":"1f28a9ff761b66da681588e5f6cd8e3fd964d24a370abec2ce7bd14f10393f95"}
Apr 23 14:58:13.489506 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.489441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" event={"ID":"43c1c226-867a-4730-91bb-e0380cebaf6e","Type":"ContainerStarted","Data":"8e048ec20733983d7959180a2372ab3f4c31a088e4a0cb4be325b49e64a95194"}
Apr 23 14:58:13.952077 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.952041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:13.952867 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:13.952839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:14.177468 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:14.177431 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:58:14.342071 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:14.341823 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:58:14.346851 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:14.346817 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934c6057_02f6_496b_b5b4_27daa3211404.slice/crio-cb35df93cb8ea699c83fcab9ab4497888d424feeabf714f25eda1f5c63210254 WatchSource:0}: Error finding container cb35df93cb8ea699c83fcab9ab4497888d424feeabf714f25eda1f5c63210254: Status 404 returned error can't find the container with id cb35df93cb8ea699c83fcab9ab4497888d424feeabf714f25eda1f5c63210254
Apr 23 14:58:14.493064 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:14.492983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"cb35df93cb8ea699c83fcab9ab4497888d424feeabf714f25eda1f5c63210254"}
Apr 23 14:58:14.494668 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:14.494639 2577 generic.go:358] "Generic (PLEG): container finished" podID="94819bd0-3e06-4ceb-94d5-520933061da5" containerID="fad094902dd902654eb44b3ec590358a3bb4bd6d4bf69cf650b957973c360fca" exitCode=0
Apr 23 14:58:14.494813 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:14.494693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7hbb" event={"ID":"94819bd0-3e06-4ceb-94d5-520933061da5","Type":"ContainerDied","Data":"fad094902dd902654eb44b3ec590358a3bb4bd6d4bf69cf650b957973c360fca"}
Apr 23 14:58:15.499759 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.499715 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5fx2c" event={"ID":"329f6021-c417-448e-80e7-c55ea27c6445","Type":"ContainerStarted","Data":"40104626ffca42d6df8d20c5a534034716e37574b4c5eb43ee90e85cf357607a"}
Apr 23 14:58:15.501857 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.501820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7hbb" event={"ID":"94819bd0-3e06-4ceb-94d5-520933061da5","Type":"ContainerStarted","Data":"0d369741c05e62c697169e4f1524da1d9d944ac1bb88920716eda0f9bddcc966"}
Apr 23 14:58:15.501989 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.501864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-t7hbb" event={"ID":"94819bd0-3e06-4ceb-94d5-520933061da5","Type":"ContainerStarted","Data":"caf8c06ad597ad778d13dd40845bb680a416d5e12d1894c31c7d41005211fd7b"}
Apr 23 14:58:15.503941 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.503909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" event={"ID":"43c1c226-867a-4730-91bb-e0380cebaf6e","Type":"ContainerStarted","Data":"ec296212957a9620abdfed022325056b80526826b2c0a61533245870f14dcec7"}
Apr 23 14:58:15.521232 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.521174 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5fx2c" podStartSLOduration=0.922934168 podStartE2EDuration="3.521158824s" podCreationTimestamp="2026-04-23 14:58:12 +0000 UTC" firstStartedPulling="2026-04-23 14:58:12.537323542 +0000 UTC m=+53.044405907" lastFinishedPulling="2026-04-23 14:58:15.135548196 +0000 UTC m=+55.642630563" observedRunningTime="2026-04-23 14:58:15.520280835 +0000 UTC m=+56.027363232" watchObservedRunningTime="2026-04-23 14:58:15.521158824 +0000 UTC m=+56.028241210"
Apr 23 14:58:15.545730 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.545652 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-t7hbb" podStartSLOduration=2.7255412 podStartE2EDuration="3.54563441s" podCreationTimestamp="2026-04-23 14:58:12 +0000 UTC" firstStartedPulling="2026-04-23 14:58:13.176876135 +0000 UTC m=+53.683958503" lastFinishedPulling="2026-04-23 14:58:13.996969349 +0000 UTC m=+54.504051713" observedRunningTime="2026-04-23 14:58:15.54482927 +0000 UTC m=+56.051911667" watchObservedRunningTime="2026-04-23 14:58:15.54563441 +0000 UTC m=+56.052716797"
Apr 23 14:58:15.563615 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:15.563549 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kh5pf" podStartSLOduration=1.848840589 podStartE2EDuration="3.563534117s" podCreationTimestamp="2026-04-23 14:58:12 +0000 UTC" firstStartedPulling="2026-04-23 14:58:13.42086034 +0000 UTC m=+53.927942704" lastFinishedPulling="2026-04-23 14:58:15.135553856 +0000 UTC m=+55.642636232" observedRunningTime="2026-04-23 14:58:15.56224413 +0000 UTC m=+56.069326498" watchObservedRunningTime="2026-04-23 14:58:15.563534117 +0000 UTC m=+56.070616525"
Apr 23 14:58:16.508171 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.508140 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" containerID="85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9" exitCode=0
Apr 23 14:58:16.508647 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.508242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"}
Apr 23 14:58:16.549749 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.549714 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5c59f89644-9dzhs"]
Apr 23 14:58:16.554156 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.554139 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.556715 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.556677 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 14:58:16.556868 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.556735 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 14:58:16.556935 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.556869 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ctruoi7eo053\""
Apr 23 14:58:16.556935 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.556873 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 14:58:16.557030 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.556977 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 14:58:16.557190 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.557171 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5tgkl\""
Apr 23 14:58:16.565932 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.565909 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5c59f89644-9dzhs"]
Apr 23 14:58:16.580382 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.580345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-audit-log\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.580541 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.580429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-metrics-server-audit-profiles\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.580611 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.580548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-client-ca-bundle\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.580666 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.580607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-secret-metrics-server-tls\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.580862 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.580826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwhfl\" (UniqueName: \"kubernetes.io/projected/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-kube-api-access-pwhfl\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.580969 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.580952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-secret-metrics-server-client-certs\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.581075 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.581059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682294 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-secret-metrics-server-tls\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682494 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwhfl\" (UniqueName: \"kubernetes.io/projected/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-kube-api-access-pwhfl\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682494 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-secret-metrics-server-client-certs\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682612 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682612 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-audit-log\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682612 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682599 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-metrics-server-audit-profiles\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs"
Apr 23 14:58:16.682797 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-client-ca-bundle\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") "
pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.683010 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.682986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-audit-log\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.683277 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.683258 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.683778 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.683754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-metrics-server-audit-profiles\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.684984 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.684958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-secret-metrics-server-client-certs\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.685107 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.685088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-secret-metrics-server-tls\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.685160 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.685129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-client-ca-bundle\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.691075 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.691056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwhfl\" (UniqueName: \"kubernetes.io/projected/97d7f10c-9cd4-4ae9-9636-cb7586b6f490-kube-api-access-pwhfl\") pod \"metrics-server-5c59f89644-9dzhs\" (UID: \"97d7f10c-9cd4-4ae9-9636-cb7586b6f490\") " pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.863443 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.863340 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:16.874024 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.873974 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw"] Apr 23 14:58:16.878049 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.878025 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:16.880282 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.880251 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 14:58:16.880551 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.880347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-mb8zp\"" Apr 23 14:58:16.886414 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.886372 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw"] Apr 23 14:58:16.984376 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.984342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fb6691a-6e97-45ad-a1d2-d761e8d9f27f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-87wcw\" (UID: \"3fb6691a-6e97-45ad-a1d2-d761e8d9f27f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:16.988403 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:16.988376 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5c59f89644-9dzhs"] Apr 23 14:58:16.991947 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:16.991915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d7f10c_9cd4_4ae9_9636_cb7586b6f490.slice/crio-4b09a5e980a641427e0f63b533f483d61a9a871fee9122dc5cd1ade7cd1775a8 WatchSource:0}: Error finding container 4b09a5e980a641427e0f63b533f483d61a9a871fee9122dc5cd1ade7cd1775a8: Status 404 returned error can't find the container with id 4b09a5e980a641427e0f63b533f483d61a9a871fee9122dc5cd1ade7cd1775a8 Apr 23 14:58:17.085632 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:58:17.085596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fb6691a-6e97-45ad-a1d2-d761e8d9f27f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-87wcw\" (UID: \"3fb6691a-6e97-45ad-a1d2-d761e8d9f27f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:17.088382 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:17.088356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fb6691a-6e97-45ad-a1d2-d761e8d9f27f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-87wcw\" (UID: \"3fb6691a-6e97-45ad-a1d2-d761e8d9f27f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:17.190810 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:17.190773 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:17.325872 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:17.325814 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw"] Apr 23 14:58:17.328522 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:17.328495 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fb6691a_6e97_45ad_a1d2_d761e8d9f27f.slice/crio-a3a1ac1de32cd743f5f25c1a91f78b9317f529b384d179ad48fc4bb067a8067c WatchSource:0}: Error finding container a3a1ac1de32cd743f5f25c1a91f78b9317f529b384d179ad48fc4bb067a8067c: Status 404 returned error can't find the container with id a3a1ac1de32cd743f5f25c1a91f78b9317f529b384d179ad48fc4bb067a8067c Apr 23 14:58:17.514459 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:17.514364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" 
event={"ID":"3fb6691a-6e97-45ad-a1d2-d761e8d9f27f","Type":"ContainerStarted","Data":"a3a1ac1de32cd743f5f25c1a91f78b9317f529b384d179ad48fc4bb067a8067c"} Apr 23 14:58:17.515638 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:17.515604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" event={"ID":"97d7f10c-9cd4-4ae9-9636-cb7586b6f490","Type":"ContainerStarted","Data":"4b09a5e980a641427e0f63b533f483d61a9a871fee9122dc5cd1ade7cd1775a8"} Apr 23 14:58:19.435114 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.435044 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpxb2" Apr 23 14:58:19.524366 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.524335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" event={"ID":"3fb6691a-6e97-45ad-a1d2-d761e8d9f27f","Type":"ContainerStarted","Data":"d4621cdb1ff1778a809286597b33aaa09455843311c1ff9f76e4dcdcb78daa5c"} Apr 23 14:58:19.524718 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.524671 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:19.525673 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.525644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" event={"ID":"97d7f10c-9cd4-4ae9-9636-cb7586b6f490","Type":"ContainerStarted","Data":"61a1f7c387495ed387e04d1bbb22e3d52b7013483ecab41d5124b5128adf5bcc"} Apr 23 14:58:19.529748 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.529723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"} Apr 23 14:58:19.529848 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:58:19.529756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"} Apr 23 14:58:19.529848 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.529768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"} Apr 23 14:58:19.529848 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.529779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"} Apr 23 14:58:19.529848 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.529790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"} Apr 23 14:58:19.531599 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.531579 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" Apr 23 14:58:19.544488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.544436 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-87wcw" podStartSLOduration=1.6016519919999999 podStartE2EDuration="3.544423718s" podCreationTimestamp="2026-04-23 14:58:16 +0000 UTC" firstStartedPulling="2026-04-23 14:58:17.330747733 +0000 UTC m=+57.837830098" lastFinishedPulling="2026-04-23 14:58:19.273519445 +0000 
UTC m=+59.780601824" observedRunningTime="2026-04-23 14:58:19.543635862 +0000 UTC m=+60.050718250" watchObservedRunningTime="2026-04-23 14:58:19.544423718 +0000 UTC m=+60.051506082" Apr 23 14:58:19.566514 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.566454 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" podStartSLOduration=1.332964485 podStartE2EDuration="3.566432957s" podCreationTimestamp="2026-04-23 14:58:16 +0000 UTC" firstStartedPulling="2026-04-23 14:58:16.993840362 +0000 UTC m=+57.500922730" lastFinishedPulling="2026-04-23 14:58:19.22730883 +0000 UTC m=+59.734391202" observedRunningTime="2026-04-23 14:58:19.566341693 +0000 UTC m=+60.073424080" watchObservedRunningTime="2026-04-23 14:58:19.566432957 +0000 UTC m=+60.073515348" Apr 23 14:58:19.797866 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.797785 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"] Apr 23 14:58:19.801126 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.801103 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.804219 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.804187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 14:58:19.804397 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.804187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mmzjs\"" Apr 23 14:58:19.805647 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.805623 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 14:58:19.805647 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.805636 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 14:58:19.805856 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.805662 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 14:58:19.805856 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.805772 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 14:58:19.806637 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.806621 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 14:58:19.809207 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.809119 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 14:58:19.811382 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.811358 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 14:58:19.812764 ip-10-0-141-16 kubenswrapper[2577]: 
I0423 14:58:19.812740 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"] Apr 23 14:58:19.912164 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-oauth-config\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.912308 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-trusted-ca-bundle\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.912308 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-service-ca\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.912308 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912281 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-oauth-serving-cert\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.912308 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912301 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-console-config\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.912447 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-serving-cert\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:19.912447 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:19.912430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7rp\" (UniqueName: \"kubernetes.io/projected/d762ca36-74c1-4055-b0c8-bda608d1c686-kube-api-access-ts7rp\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013152 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-oauth-config\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013152 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-trusted-ca-bundle\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " 
pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013402 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-service-ca\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013402 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-oauth-serving-cert\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013402 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-console-config\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013402 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-serving-cert\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.013402 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.013319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7rp\" (UniqueName: \"kubernetes.io/projected/d762ca36-74c1-4055-b0c8-bda608d1c686-kube-api-access-ts7rp\") pod 
\"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.015865 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.015835 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 14:58:20.015991 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.015893 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 14:58:20.015991 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.015900 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 14:58:20.016120 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.016010 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 14:58:20.016528 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.016499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 14:58:20.021171 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.021149 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 14:58:20.022637 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.022616 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 14:58:20.024146 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.024122 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-service-ca\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.024279 ip-10-0-141-16 
kubenswrapper[2577]: I0423 14:58:20.024256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-console-config\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.024348 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.024258 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-trusted-ca-bundle\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.024594 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.024573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-oauth-serving-cert\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.026317 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.026295 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-oauth-config\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.026802 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.026776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-serving-cert\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 
14:58:20.032822 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.032798 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 14:58:20.043551 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.043518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7rp\" (UniqueName: \"kubernetes.io/projected/d762ca36-74c1-4055-b0c8-bda608d1c686-kube-api-access-ts7rp\") pod \"console-78c5b7b795-rswtk\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.114401 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.114326 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mmzjs\"" Apr 23 14:58:20.121734 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.121714 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:20.257432 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.257401 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"] Apr 23 14:58:20.261132 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:20.261100 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd762ca36_74c1_4055_b0c8_bda608d1c686.slice/crio-3b4c0fd98785f509167afc0b920b17f6c31af9792e14a4cb252951a8624d7c4a WatchSource:0}: Error finding container 3b4c0fd98785f509167afc0b920b17f6c31af9792e14a4cb252951a8624d7c4a: Status 404 returned error can't find the container with id 3b4c0fd98785f509167afc0b920b17f6c31af9792e14a4cb252951a8624d7c4a Apr 23 14:58:20.534841 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.534808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerStarted","Data":"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"} Apr 23 14:58:20.535839 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.535816 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78c5b7b795-rswtk" event={"ID":"d762ca36-74c1-4055-b0c8-bda608d1c686","Type":"ContainerStarted","Data":"3b4c0fd98785f509167afc0b920b17f6c31af9792e14a4cb252951a8624d7c4a"} Apr 23 14:58:20.565583 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:20.565528 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.443254912 podStartE2EDuration="7.565512863s" podCreationTimestamp="2026-04-23 14:58:13 +0000 UTC" firstStartedPulling="2026-04-23 14:58:14.349393268 +0000 UTC m=+54.856475635" lastFinishedPulling="2026-04-23 14:58:20.471651217 +0000 UTC m=+60.978733586" observedRunningTime="2026-04-23 14:58:20.56365563 +0000 UTC m=+61.070738039" watchObservedRunningTime="2026-04-23 14:58:20.565512863 +0000 UTC m=+61.072595249" Apr 23 14:58:22.465447 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:22.465406 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-744fbcdbfd-v2rps" Apr 23 14:58:23.546861 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:23.546823 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78c5b7b795-rswtk" event={"ID":"d762ca36-74c1-4055-b0c8-bda608d1c686","Type":"ContainerStarted","Data":"f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6"} Apr 23 14:58:23.565817 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:23.565769 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78c5b7b795-rswtk" podStartSLOduration=2.023461824 podStartE2EDuration="4.565754574s" podCreationTimestamp="2026-04-23 14:58:19 +0000 
UTC" firstStartedPulling="2026-04-23 14:58:20.263267875 +0000 UTC m=+60.770350242" lastFinishedPulling="2026-04-23 14:58:22.805560627 +0000 UTC m=+63.312642992" observedRunningTime="2026-04-23 14:58:23.56414133 +0000 UTC m=+64.071223713" watchObservedRunningTime="2026-04-23 14:58:23.565754574 +0000 UTC m=+64.072836960" Apr 23 14:58:24.755476 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.755430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:58:24.758027 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.758004 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 14:58:24.767949 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.767922 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be6b2313-a857-46d8-8b0d-adbd4a48cb9d-metrics-certs\") pod \"network-metrics-daemon-bjpzv\" (UID: \"be6b2313-a857-46d8-8b0d-adbd4a48cb9d\") " pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:58:24.856087 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.856045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:58:24.858645 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.858612 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 
14:58:24.868964 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.868939 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 14:58:24.879369 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:24.879347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm97q\" (UniqueName: \"kubernetes.io/projected/3236e428-70b1-4400-9f33-348489a945df-kube-api-access-rm97q\") pod \"network-check-target-6rls8\" (UID: \"3236e428-70b1-4400-9f33-348489a945df\") " pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:58:25.002205 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.002167 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-shvg2\"" Apr 23 14:58:25.007796 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.007730 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qqjqd\"" Apr 23 14:58:25.010597 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.010581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:58:25.015308 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.015291 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bjpzv" Apr 23 14:58:25.141612 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.141571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6rls8"] Apr 23 14:58:25.146460 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:25.146425 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3236e428_70b1_4400_9f33_348489a945df.slice/crio-2fdaccfc5689ecbed89972bc85e4abf918fa4739e36460fa30aa89dcb0788c66 WatchSource:0}: Error finding container 2fdaccfc5689ecbed89972bc85e4abf918fa4739e36460fa30aa89dcb0788c66: Status 404 returned error can't find the container with id 2fdaccfc5689ecbed89972bc85e4abf918fa4739e36460fa30aa89dcb0788c66 Apr 23 14:58:25.158963 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.158866 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bjpzv"] Apr 23 14:58:25.161661 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:58:25.161635 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe6b2313_a857_46d8_8b0d_adbd4a48cb9d.slice/crio-a8bef4eb2dcde730dcc3c2578283fbfe1c47769eda4db5bacc812892e03bad2f WatchSource:0}: Error finding container a8bef4eb2dcde730dcc3c2578283fbfe1c47769eda4db5bacc812892e03bad2f: Status 404 returned error can't find the container with id a8bef4eb2dcde730dcc3c2578283fbfe1c47769eda4db5bacc812892e03bad2f Apr 23 14:58:25.553171 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:25.553134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bjpzv" event={"ID":"be6b2313-a857-46d8-8b0d-adbd4a48cb9d","Type":"ContainerStarted","Data":"a8bef4eb2dcde730dcc3c2578283fbfe1c47769eda4db5bacc812892e03bad2f"} Apr 23 14:58:25.554002 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:58:25.553979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6rls8" event={"ID":"3236e428-70b1-4400-9f33-348489a945df","Type":"ContainerStarted","Data":"2fdaccfc5689ecbed89972bc85e4abf918fa4739e36460fa30aa89dcb0788c66"} Apr 23 14:58:27.565031 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:27.564990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bjpzv" event={"ID":"be6b2313-a857-46d8-8b0d-adbd4a48cb9d","Type":"ContainerStarted","Data":"522a2989578c264297b8eb62ac8524a2c2e7179250c0f2d4329f02347340a37c"} Apr 23 14:58:28.569825 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:28.569780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bjpzv" event={"ID":"be6b2313-a857-46d8-8b0d-adbd4a48cb9d","Type":"ContainerStarted","Data":"df7ca033588f5a68555667986538b300b016e4c2fce6ac5b5f233d9379f75f24"} Apr 23 14:58:28.571075 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:28.571049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6rls8" event={"ID":"3236e428-70b1-4400-9f33-348489a945df","Type":"ContainerStarted","Data":"2da3c5e95227e16f004239ee681a12b2a9d015c16f75a7940eafca56c74bfc35"} Apr 23 14:58:28.571178 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:28.571172 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:58:28.590379 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:28.590286 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bjpzv" podStartSLOduration=67.006227172 podStartE2EDuration="1m8.590267736s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:58:25.163128133 +0000 UTC m=+65.670210500" lastFinishedPulling="2026-04-23 14:58:26.747168688 
+0000 UTC m=+67.254251064" observedRunningTime="2026-04-23 14:58:28.589657724 +0000 UTC m=+69.096740110" watchObservedRunningTime="2026-04-23 14:58:28.590267736 +0000 UTC m=+69.097350121" Apr 23 14:58:28.607463 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:28.607413 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6rls8" podStartSLOduration=65.797787461 podStartE2EDuration="1m8.607395785s" podCreationTimestamp="2026-04-23 14:57:20 +0000 UTC" firstStartedPulling="2026-04-23 14:58:25.148576786 +0000 UTC m=+65.655659154" lastFinishedPulling="2026-04-23 14:58:27.958185094 +0000 UTC m=+68.465267478" observedRunningTime="2026-04-23 14:58:28.606457462 +0000 UTC m=+69.113539849" watchObservedRunningTime="2026-04-23 14:58:28.607395785 +0000 UTC m=+69.114478172" Apr 23 14:58:30.122790 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:30.122748 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:30.122790 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:30.122795 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:30.128037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:30.128014 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:30.580454 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:30.580427 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 14:58:36.863739 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:36.863679 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:36.863739 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:36.863751 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:56.868641 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:56.868605 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:56.872906 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:56.872880 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5c59f89644-9dzhs" Apr 23 14:58:59.576642 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:58:59.576612 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6rls8" Apr 23 14:59:32.322191 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.322153 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 14:59:32.322851 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.322792 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="alertmanager" containerID="cri-o://0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b" gracePeriod=120 Apr 23 14:59:32.322999 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.322865 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-web" containerID="cri-o://bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4" gracePeriod=120 Apr 23 14:59:32.322999 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.322929 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy" 
containerID="cri-o://486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210" gracePeriod=120 Apr 23 14:59:32.322999 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.322865 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-metric" containerID="cri-o://24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e" gracePeriod=120 Apr 23 14:59:32.322999 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.322947 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="prom-label-proxy" containerID="cri-o://f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5" gracePeriod=120 Apr 23 14:59:32.323261 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.323031 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="config-reloader" containerID="cri-o://9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e" gracePeriod=120 Apr 23 14:59:32.745421 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745390 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" containerID="f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5" exitCode=0 Apr 23 14:59:32.745421 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745414 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" containerID="486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210" exitCode=0 Apr 23 14:59:32.745421 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745421 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" 
containerID="9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e" exitCode=0 Apr 23 14:59:32.745421 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745427 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" containerID="0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b" exitCode=0 Apr 23 14:59:32.745689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745467 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"} Apr 23 14:59:32.745689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"} Apr 23 14:59:32.745689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"} Apr 23 14:59:32.745689 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:32.745524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"} Apr 23 14:59:33.562288 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.562264 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 14:59:33.592060 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592024 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82bql\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-kube-api-access-82bql\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592231 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592073 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-web-config\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592231 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592108 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-main-tls\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592231 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592140 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592231 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592167 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-config-volume\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 
14:59:33.592231 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592195 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-main-db\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592231 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592225 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-metric\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592265 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-cluster-tls-config\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592298 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-config-out\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592326 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-metrics-client-ca\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592512 ip-10-0-141-16 kubenswrapper[2577]: I0423 
14:59:33.592384 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-tls-assets\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592432 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-web\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592512 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592463 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy\") pod \"934c6057-02f6-496b-b5b4-27daa3211404\" (UID: \"934c6057-02f6-496b-b5b4-27daa3211404\") " Apr 23 14:59:33.592918 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.592776 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:59:33.593893 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.593854 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:59:33.594571 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.594533 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:59:33.596488 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.596446 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-config-volume" (OuterVolumeSpecName: "config-volume") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.598136 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.598063 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.599484 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.599374 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.599484 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.599447 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:59:33.600904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.600865 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.600904 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.600879 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-config-out" (OuterVolumeSpecName: "config-out") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:59:33.601314 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.601279 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-kube-api-access-82bql" (OuterVolumeSpecName: "kube-api-access-82bql") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "kube-api-access-82bql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:59:33.602867 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.602841 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.605047 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.605020 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.610657 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.610631 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-web-config" (OuterVolumeSpecName: "web-config") pod "934c6057-02f6-496b-b5b4-27daa3211404" (UID: "934c6057-02f6-496b-b5b4-27daa3211404"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:59:33.693795 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693755 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-tls-assets\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.693795 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693791 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693816 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693830 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82bql\" (UniqueName: \"kubernetes.io/projected/934c6057-02f6-496b-b5b4-27daa3211404-kube-api-access-82bql\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693845 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-web-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693859 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-main-tls\") on node 
\"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693871 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693883 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-config-volume\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693894 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-alertmanager-main-db\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693907 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693920 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/934c6057-02f6-496b-b5b4-27daa3211404-cluster-tls-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693932 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/934c6057-02f6-496b-b5b4-27daa3211404-config-out\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" 
Apr 23 14:59:33.694037 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.693943 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/934c6057-02f6-496b-b5b4-27daa3211404-metrics-client-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 14:59:33.750630 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750598 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" containerID="24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e" exitCode=0
Apr 23 14:59:33.750630 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750628 2577 generic.go:358] "Generic (PLEG): container finished" podID="934c6057-02f6-496b-b5b4-27daa3211404" containerID="bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4" exitCode=0
Apr 23 14:59:33.750885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"}
Apr 23 14:59:33.750885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750727 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.750885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"}
Apr 23 14:59:33.750885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"934c6057-02f6-496b-b5b4-27daa3211404","Type":"ContainerDied","Data":"cb35df93cb8ea699c83fcab9ab4497888d424feeabf714f25eda1f5c63210254"}
Apr 23 14:59:33.750885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.750768 2577 scope.go:117] "RemoveContainer" containerID="f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"
Apr 23 14:59:33.758549 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.758527 2577 scope.go:117] "RemoveContainer" containerID="24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"
Apr 23 14:59:33.765443 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.765423 2577 scope.go:117] "RemoveContainer" containerID="486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"
Apr 23 14:59:33.771465 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.771446 2577 scope.go:117] "RemoveContainer" containerID="bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"
Apr 23 14:59:33.777342 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.777267 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:59:33.778844 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.778778 2577 scope.go:117] "RemoveContainer" containerID="9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"
Apr 23 14:59:33.780233 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.780213 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:59:33.785275 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.785258 2577 scope.go:117] "RemoveContainer" containerID="0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"
Apr 23 14:59:33.791577 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.791561 2577 scope.go:117] "RemoveContainer" containerID="85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"
Apr 23 14:59:33.797602 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.797583 2577 scope.go:117] "RemoveContainer" containerID="f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"
Apr 23 14:59:33.797877 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.797855 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5\": container with ID starting with f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5 not found: ID does not exist" containerID="f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"
Apr 23 14:59:33.797952 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.797890 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"} err="failed to get container status \"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5\": rpc error: code = NotFound desc = could not find container \"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5\": container with ID starting with f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5 not found: ID does not exist"
Apr 23 14:59:33.797952 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.797936 2577 scope.go:117] "RemoveContainer" containerID="24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"
Apr 23 14:59:33.798183 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.798165 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e\": container with ID starting with 24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e not found: ID does not exist" containerID="24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"
Apr 23 14:59:33.798223 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798189 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"} err="failed to get container status \"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e\": rpc error: code = NotFound desc = could not find container \"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e\": container with ID starting with 24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e not found: ID does not exist"
Apr 23 14:59:33.798223 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798205 2577 scope.go:117] "RemoveContainer" containerID="486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"
Apr 23 14:59:33.798412 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.798398 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210\": container with ID starting with 486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210 not found: ID does not exist" containerID="486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"
Apr 23 14:59:33.798460 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798414 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"} err="failed to get container status \"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210\": rpc error: code = NotFound desc = could not find container \"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210\": container with ID starting with 486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210 not found: ID does not exist"
Apr 23 14:59:33.798460 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798425 2577 scope.go:117] "RemoveContainer" containerID="bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"
Apr 23 14:59:33.798640 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.798623 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4\": container with ID starting with bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4 not found: ID does not exist" containerID="bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"
Apr 23 14:59:33.798677 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798646 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"} err="failed to get container status \"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4\": rpc error: code = NotFound desc = could not find container \"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4\": container with ID starting with bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4 not found: ID does not exist"
Apr 23 14:59:33.798677 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798661 2577 scope.go:117] "RemoveContainer" containerID="9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"
Apr 23 14:59:33.798928 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.798911 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e\": container with ID starting with 9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e not found: ID does not exist" containerID="9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"
Apr 23 14:59:33.798985 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798932 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"} err="failed to get container status \"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e\": rpc error: code = NotFound desc = could not find container \"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e\": container with ID starting with 9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e not found: ID does not exist"
Apr 23 14:59:33.798985 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.798947 2577 scope.go:117] "RemoveContainer" containerID="0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"
Apr 23 14:59:33.799166 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.799151 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b\": container with ID starting with 0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b not found: ID does not exist" containerID="0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"
Apr 23 14:59:33.799204 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799171 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"} err="failed to get container status \"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b\": rpc error: code = NotFound desc = could not find container \"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b\": container with ID starting with 0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b not found: ID does not exist"
Apr 23 14:59:33.799204 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799184 2577 scope.go:117] "RemoveContainer" containerID="85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"
Apr 23 14:59:33.799396 ip-10-0-141-16 kubenswrapper[2577]: E0423 14:59:33.799380 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9\": container with ID starting with 85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9 not found: ID does not exist" containerID="85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"
Apr 23 14:59:33.799446 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799400 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"} err="failed to get container status \"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9\": rpc error: code = NotFound desc = could not find container \"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9\": container with ID starting with 85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9 not found: ID does not exist"
Apr 23 14:59:33.799446 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799412 2577 scope.go:117] "RemoveContainer" containerID="f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"
Apr 23 14:59:33.799621 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799604 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5"} err="failed to get container status \"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5\": rpc error: code = NotFound desc = could not find container \"f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5\": container with ID starting with f2c4838a63d1dc0ebea11c377931a24aa35608aeb4dc76320f712ea2a28bd3c5 not found: ID does not exist"
Apr 23 14:59:33.799674 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799621 2577 scope.go:117] "RemoveContainer" containerID="24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"
Apr 23 14:59:33.799854 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799837 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e"} err="failed to get container status \"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e\": rpc error: code = NotFound desc = could not find container \"24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e\": container with ID starting with 24cf26aa63230e1478f842d25dae84a0a5d7f5428eb646a4727446046a9c924e not found: ID does not exist"
Apr 23 14:59:33.799906 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.799855 2577 scope.go:117] "RemoveContainer" containerID="486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"
Apr 23 14:59:33.800064 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800049 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210"} err="failed to get container status \"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210\": rpc error: code = NotFound desc = could not find container \"486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210\": container with ID starting with 486698b340b17e43b3c9ee0483b14ae93fc2c1a428280a2ad4cb7668ef206210 not found: ID does not exist"
Apr 23 14:59:33.800110 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800065 2577 scope.go:117] "RemoveContainer" containerID="bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"
Apr 23 14:59:33.800279 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800260 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4"} err="failed to get container status \"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4\": rpc error: code = NotFound desc = could not find container \"bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4\": container with ID starting with bbe97b40a1e2bdd560ec54b2ed116e118ac8abcb2f855a53b0034c38296bbca4 not found: ID does not exist"
Apr 23 14:59:33.800279 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800279 2577 scope.go:117] "RemoveContainer" containerID="9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"
Apr 23 14:59:33.800514 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800492 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e"} err="failed to get container status \"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e\": rpc error: code = NotFound desc = could not find container \"9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e\": container with ID starting with 9040e4a13bf6fcad65d16a85b3c4dc23015a25a534b6db0251382eaeed1c673e not found: ID does not exist"
Apr 23 14:59:33.800514 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800512 2577 scope.go:117] "RemoveContainer" containerID="0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"
Apr 23 14:59:33.800737 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800719 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b"} err="failed to get container status \"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b\": rpc error: code = NotFound desc = could not find container \"0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b\": container with ID starting with 0536f9fc0ef75124ca9c82880250bec91a66ab1db201a69e5d70866f5563d35b not found: ID does not exist"
Apr 23 14:59:33.800782 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800738 2577 scope.go:117] "RemoveContainer" containerID="85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"
Apr 23 14:59:33.800941 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.800921 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9"} err="failed to get container status \"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9\": rpc error: code = NotFound desc = could not find container \"85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9\": container with ID starting with 85ece71b66b527ad194670042c49d3d52d5c9623d0f45b4b2aeb2b7cc9c97fa9 not found: ID does not exist"
Apr 23 14:59:33.813200 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813178 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:59:33.813419 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813407 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="init-config-reloader"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813420 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="init-config-reloader"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813429 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="alertmanager"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813435 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="alertmanager"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813442 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-metric"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813447 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-metric"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813457 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="prom-label-proxy"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813462 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="prom-label-proxy"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813470 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="config-reloader"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813474 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="config-reloader"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813481 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy"
Apr 23 14:59:33.813482 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813486 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813493 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-web"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813498 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-web"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813534 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-web"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813541 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy-metric"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813546 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="prom-label-proxy"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813553 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="kube-rbac-proxy"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813560 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="alertmanager"
Apr 23 14:59:33.813979 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.813566 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="934c6057-02f6-496b-b5b4-27daa3211404" containerName="config-reloader"
Apr 23 14:59:33.817758 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.817741 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.824099 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.820639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 14:59:33.824776 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.824754 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 14:59:33.824926 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.824900 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7pr5v\""
Apr 23 14:59:33.825036 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.824779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 14:59:33.825036 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.825003 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 14:59:33.825153 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.824808 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 14:59:33.825153 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.824814 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 14:59:33.825153 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.824812 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 14:59:33.825497 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.825483 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 14:59:33.828287 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.828269 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 14:59:33.836094 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.836070 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 14:59:33.895501 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895501 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwq8\" (UniqueName: \"kubernetes.io/projected/a048f109-903d-48f3-8b93-bf6c8b811e53-kube-api-access-xrwq8\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-web-config\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a048f109-903d-48f3-8b93-bf6c8b811e53-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-config-volume\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a048f109-903d-48f3-8b93-bf6c8b811e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a048f109-903d-48f3-8b93-bf6c8b811e53-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.895787 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.896033 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.896033 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a048f109-903d-48f3-8b93-bf6c8b811e53-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.896033 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a048f109-903d-48f3-8b93-bf6c8b811e53-config-out\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.896033 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.895878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997126 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-config-volume\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997126 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997334 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a048f109-903d-48f3-8b93-bf6c8b811e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997334 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a048f109-903d-48f3-8b93-bf6c8b811e53-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997334 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997484 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997484 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a048f109-903d-48f3-8b93-bf6c8b811e53-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997484 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a048f109-903d-48f3-8b93-bf6c8b811e53-config-out\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997484 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997672 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997672 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwq8\" (UniqueName: \"kubernetes.io/projected/a048f109-903d-48f3-8b93-bf6c8b811e53-kube-api-access-xrwq8\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997672 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-web-config\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997672 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a048f109-903d-48f3-8b93-bf6c8b811e53-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.997905 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a048f109-903d-48f3-8b93-bf6c8b811e53-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.998009 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.997983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a048f109-903d-48f3-8b93-bf6c8b811e53-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:33.998225 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:33.998195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a048f109-903d-48f3-8b93-bf6c8b811e53-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000298 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000464 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-config-volume\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000637 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a048f109-903d-48f3-8b93-bf6c8b811e53-config-out\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000751 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000723 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000871 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000871 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.000986 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.000909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a048f109-903d-48f3-8b93-bf6c8b811e53-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 14:59:34.001045 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.001031 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-web-config\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 14:59:34.001792 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.001774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a048f109-903d-48f3-8b93-bf6c8b811e53-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 14:59:34.012509 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.012488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwq8\" (UniqueName: \"kubernetes.io/projected/a048f109-903d-48f3-8b93-bf6c8b811e53-kube-api-access-xrwq8\") pod \"alertmanager-main-0\" (UID: \"a048f109-903d-48f3-8b93-bf6c8b811e53\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 14:59:34.094203 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.094168 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934c6057-02f6-496b-b5b4-27daa3211404" path="/var/lib/kubelet/pods/934c6057-02f6-496b-b5b4-27daa3211404/volumes" Apr 23 14:59:34.130885 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.130851 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 14:59:34.261156 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.261071 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 14:59:34.263906 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:59:34.263877 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda048f109_903d_48f3_8b93_bf6c8b811e53.slice/crio-7fa824b44e8b0048f8e15312ba901e648785fe99336a4bd44d576a288359f087 WatchSource:0}: Error finding container 7fa824b44e8b0048f8e15312ba901e648785fe99336a4bd44d576a288359f087: Status 404 returned error can't find the container with id 7fa824b44e8b0048f8e15312ba901e648785fe99336a4bd44d576a288359f087 Apr 23 14:59:34.755655 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.755623 2577 generic.go:358] "Generic (PLEG): container finished" podID="a048f109-903d-48f3-8b93-bf6c8b811e53" containerID="b3284bc427d505f1085bc94d83831d5d4f2ed470edc85a140bbc523a49a062c4" exitCode=0 Apr 23 14:59:34.756045 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.755720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerDied","Data":"b3284bc427d505f1085bc94d83831d5d4f2ed470edc85a140bbc523a49a062c4"} Apr 23 14:59:34.756045 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:34.755760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"7fa824b44e8b0048f8e15312ba901e648785fe99336a4bd44d576a288359f087"} Apr 23 14:59:35.762257 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.762223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"9f2f8ff3cc25d87e10b8b6298608ab182e648844d1fd9d9b80157a0658237fca"} Apr 23 14:59:35.762257 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.762257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"57f7f9868e7c0301bfab93aee02d4b16247d292509bbc847ba3c067e157dcee4"} Apr 23 14:59:35.762656 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.762270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"f270f1d3c83729b19483b2dfd8d3a751957981866b5a7a8c66a89d3c54c0c153"} Apr 23 14:59:35.762656 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.762279 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"235b230c9443409431a03dd9473252fc71dcd6c5ff2a9462b5051a4b1dc38abe"} Apr 23 14:59:35.762656 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.762287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"d76745adfae1513844b5d4f8710425f591182d1a94a978d66869a16823dc4763"} Apr 23 14:59:35.762656 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.762310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a048f109-903d-48f3-8b93-bf6c8b811e53","Type":"ContainerStarted","Data":"a58b4513eeefea4dc97f82c5a251250458529958310b6a0bdb8f7cab69c2209d"} Apr 23 14:59:35.812817 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:35.812765 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.812748757 podStartE2EDuration="2.812748757s" podCreationTimestamp="2026-04-23 14:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:59:35.812442036 +0000 UTC m=+136.319524462" watchObservedRunningTime="2026-04-23 14:59:35.812748757 +0000 UTC m=+136.319831143" Apr 23 14:59:36.345745 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.345691 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-855f55c85c-km4hp"] Apr 23 14:59:36.349307 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.349291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.351839 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.351809 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 14:59:36.351967 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.351847 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 14:59:36.351967 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.351859 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 14:59:36.351967 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.351946 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 14:59:36.352138 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.352120 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 14:59:36.352212 ip-10-0-141-16 kubenswrapper[2577]: 
I0423 14:59:36.352195 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-rszjn\"" Apr 23 14:59:36.359510 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.359483 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-855f55c85c-km4hp"] Apr 23 14:59:36.359677 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.359661 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 14:59:36.414758 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-serving-certs-ca-bundle\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.414930 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.414930 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-telemeter-trusted-ca-bundle\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " 
pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.414930 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trsvv\" (UniqueName: \"kubernetes.io/projected/811c13a0-c536-4b9f-85e7-27c020082c98-kube-api-access-trsvv\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.414930 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-telemeter-client-tls\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.415114 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-secret-telemeter-client\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.415114 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.414977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-federate-client-tls\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.415114 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.415042 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-metrics-client-ca\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516299 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-serving-certs-ca-bundle\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516299 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-telemeter-trusted-ca-bundle\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trsvv\" (UniqueName: 
\"kubernetes.io/projected/811c13a0-c536-4b9f-85e7-27c020082c98-kube-api-access-trsvv\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-telemeter-client-tls\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-secret-telemeter-client\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-federate-client-tls\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.516552 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.516443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-metrics-client-ca\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 
14:59:36.517091 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.517050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-serving-certs-ca-bundle\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.517224 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.517153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-metrics-client-ca\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.517358 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.517339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/811c13a0-c536-4b9f-85e7-27c020082c98-telemeter-trusted-ca-bundle\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.519561 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.519536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-federate-client-tls\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.519561 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.519553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.519738 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.519549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-telemeter-client-tls\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.519738 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.519582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/811c13a0-c536-4b9f-85e7-27c020082c98-secret-telemeter-client\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.529481 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.529453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trsvv\" (UniqueName: \"kubernetes.io/projected/811c13a0-c536-4b9f-85e7-27c020082c98-kube-api-access-trsvv\") pod \"telemeter-client-855f55c85c-km4hp\" (UID: \"811c13a0-c536-4b9f-85e7-27c020082c98\") " pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.659919 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.659892 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" Apr 23 14:59:36.785806 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:36.785773 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-855f55c85c-km4hp"] Apr 23 14:59:36.788677 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:59:36.788650 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod811c13a0_c536_4b9f_85e7_27c020082c98.slice/crio-a35a4ce668fe042fba5aa7362b9a85989d8e70ca24e8125b947fe65379226254 WatchSource:0}: Error finding container a35a4ce668fe042fba5aa7362b9a85989d8e70ca24e8125b947fe65379226254: Status 404 returned error can't find the container with id a35a4ce668fe042fba5aa7362b9a85989d8e70ca24e8125b947fe65379226254 Apr 23 14:59:37.774877 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:37.774843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" event={"ID":"811c13a0-c536-4b9f-85e7-27c020082c98","Type":"ContainerStarted","Data":"a35a4ce668fe042fba5aa7362b9a85989d8e70ca24e8125b947fe65379226254"} Apr 23 14:59:40.785941 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:40.785901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" event={"ID":"811c13a0-c536-4b9f-85e7-27c020082c98","Type":"ContainerStarted","Data":"c89e9090dfb813aba8e09e043b8cc070e18d6be25d6b651796cb140f0470e33b"} Apr 23 14:59:40.785941 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:40.785947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" event={"ID":"811c13a0-c536-4b9f-85e7-27c020082c98","Type":"ContainerStarted","Data":"6726c0a2ab93d94a9729d43e1522d4845bed9ef59d52215fffb479fb8a7b72f9"} Apr 23 14:59:40.786359 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:40.785963 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" event={"ID":"811c13a0-c536-4b9f-85e7-27c020082c98","Type":"ContainerStarted","Data":"f70c6454cbecc9c0c66435af65ac3b3c58faa0599864a5c4b81a425afa31d529"} Apr 23 14:59:40.812679 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:40.810304 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-855f55c85c-km4hp" podStartSLOduration=1.377558939 podStartE2EDuration="4.810279511s" podCreationTimestamp="2026-04-23 14:59:36 +0000 UTC" firstStartedPulling="2026-04-23 14:59:36.790417589 +0000 UTC m=+137.297499956" lastFinishedPulling="2026-04-23 14:59:40.223138158 +0000 UTC m=+140.730220528" observedRunningTime="2026-04-23 14:59:40.806977784 +0000 UTC m=+141.314060171" watchObservedRunningTime="2026-04-23 14:59:40.810279511 +0000 UTC m=+141.317361900" Apr 23 14:59:41.502088 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.502056 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-77bf44cd89-2wbwq"] Apr 23 14:59:41.505387 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.505370 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.517944 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.517918 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77bf44cd89-2wbwq"] Apr 23 14:59:41.562822 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.562787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-service-ca\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.562998 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.562831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-trusted-ca-bundle\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.562998 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.562862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-oauth-config\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.562998 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.562901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-serving-cert\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 
14:59:41.562998 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.562963 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-console-config\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.563183 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.563025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gk6\" (UniqueName: \"kubernetes.io/projected/6c8c0668-8281-480d-bacf-26a469bd5abd-kube-api-access-86gk6\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.563183 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.563054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-oauth-serving-cert\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664177 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-service-ca\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664177 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-trusted-ca-bundle\") pod 
\"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664415 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-oauth-config\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664476 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-serving-cert\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664530 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664511 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-console-config\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664587 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86gk6\" (UniqueName: \"kubernetes.io/projected/6c8c0668-8281-480d-bacf-26a469bd5abd-kube-api-access-86gk6\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664587 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-oauth-serving-cert\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.664947 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.664926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-service-ca\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.665088 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.665069 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-trusted-ca-bundle\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.665297 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.665277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-oauth-serving-cert\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.665352 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.665331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-console-config\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.666818 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.666796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-serving-cert\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.667159 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.667141 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-oauth-config\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.674819 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.674793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86gk6\" (UniqueName: \"kubernetes.io/projected/6c8c0668-8281-480d-bacf-26a469bd5abd-kube-api-access-86gk6\") pod \"console-77bf44cd89-2wbwq\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") " pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.813744 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.813628 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:41.956811 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:41.956785 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77bf44cd89-2wbwq"] Apr 23 14:59:41.959143 ip-10-0-141-16 kubenswrapper[2577]: W0423 14:59:41.959114 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c8c0668_8281_480d_bacf_26a469bd5abd.slice/crio-41f1f4cb02224dfb1cebde4b10e314514042e15d5b9f7121cbd648c0946385fe WatchSource:0}: Error finding container 41f1f4cb02224dfb1cebde4b10e314514042e15d5b9f7121cbd648c0946385fe: Status 404 returned error can't find the container with id 41f1f4cb02224dfb1cebde4b10e314514042e15d5b9f7121cbd648c0946385fe Apr 23 14:59:42.793368 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:42.793321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bf44cd89-2wbwq" event={"ID":"6c8c0668-8281-480d-bacf-26a469bd5abd","Type":"ContainerStarted","Data":"9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501"} Apr 23 14:59:42.793368 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:42.793370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bf44cd89-2wbwq" event={"ID":"6c8c0668-8281-480d-bacf-26a469bd5abd","Type":"ContainerStarted","Data":"41f1f4cb02224dfb1cebde4b10e314514042e15d5b9f7121cbd648c0946385fe"} Apr 23 14:59:42.814557 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:42.814499 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77bf44cd89-2wbwq" podStartSLOduration=1.814480692 podStartE2EDuration="1.814480692s" podCreationTimestamp="2026-04-23 14:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:59:42.813928841 +0000 UTC m=+143.321011229" 
watchObservedRunningTime="2026-04-23 14:59:42.814480692 +0000 UTC m=+143.321563077" Apr 23 14:59:51.814747 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:51.814681 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:51.815122 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:51.814760 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:51.819855 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:51.819830 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:51.823502 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:51.823481 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77bf44cd89-2wbwq" Apr 23 14:59:51.883498 ip-10-0-141-16 kubenswrapper[2577]: I0423 14:59:51.883465 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"] Apr 23 15:00:16.902884 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:16.902827 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78c5b7b795-rswtk" podUID="d762ca36-74c1-4055-b0c8-bda608d1c686" containerName="console" containerID="cri-o://f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6" gracePeriod=15 Apr 23 15:00:17.137915 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.137893 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78c5b7b795-rswtk_d762ca36-74c1-4055-b0c8-bda608d1c686/console/0.log" Apr 23 15:00:17.138037 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.137956 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 15:00:17.159356 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159273 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-oauth-serving-cert\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159356 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159329 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-oauth-config\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159374 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts7rp\" (UniqueName: \"kubernetes.io/projected/d762ca36-74c1-4055-b0c8-bda608d1c686-kube-api-access-ts7rp\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159444 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-console-config\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159474 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-service-ca\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159544 ip-10-0-141-16 
kubenswrapper[2577]: I0423 15:00:17.159499 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-serving-cert\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159528 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-trusted-ca-bundle\") pod \"d762ca36-74c1-4055-b0c8-bda608d1c686\" (UID: \"d762ca36-74c1-4055-b0c8-bda608d1c686\") " Apr 23 15:00:17.159847 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159694 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 15:00:17.159966 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.159872 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-service-ca" (OuterVolumeSpecName: "service-ca") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 15:00:17.160146 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.160119 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-console-config" (OuterVolumeSpecName: "console-config") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 15:00:17.160146 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.160127 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 15:00:17.161994 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.161965 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 15:00:17.162078 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.161996 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d762ca36-74c1-4055-b0c8-bda608d1c686-kube-api-access-ts7rp" (OuterVolumeSpecName: "kube-api-access-ts7rp") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "kube-api-access-ts7rp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:00:17.162078 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.162051 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d762ca36-74c1-4055-b0c8-bda608d1c686" (UID: "d762ca36-74c1-4055-b0c8-bda608d1c686"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 15:00:17.260129 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260089 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-oauth-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.260129 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260122 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-oauth-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.260129 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260131 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ts7rp\" (UniqueName: \"kubernetes.io/projected/d762ca36-74c1-4055-b0c8-bda608d1c686-kube-api-access-ts7rp\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.260373 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260142 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-console-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.260373 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260151 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-service-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.260373 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260161 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d762ca36-74c1-4055-b0c8-bda608d1c686-console-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.260373 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.260169 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d762ca36-74c1-4055-b0c8-bda608d1c686-trusted-ca-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:00:17.893356 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.893325 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78c5b7b795-rswtk_d762ca36-74c1-4055-b0c8-bda608d1c686/console/0.log" Apr 23 15:00:17.893530 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.893363 2577 generic.go:358] "Generic (PLEG): container finished" podID="d762ca36-74c1-4055-b0c8-bda608d1c686" containerID="f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6" exitCode=2 Apr 23 15:00:17.893530 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.893433 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78c5b7b795-rswtk" Apr 23 15:00:17.893530 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.893433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78c5b7b795-rswtk" event={"ID":"d762ca36-74c1-4055-b0c8-bda608d1c686","Type":"ContainerDied","Data":"f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6"} Apr 23 15:00:17.893641 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.893533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78c5b7b795-rswtk" event={"ID":"d762ca36-74c1-4055-b0c8-bda608d1c686","Type":"ContainerDied","Data":"3b4c0fd98785f509167afc0b920b17f6c31af9792e14a4cb252951a8624d7c4a"} Apr 23 15:00:17.893641 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.893548 2577 scope.go:117] "RemoveContainer" containerID="f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6" Apr 23 15:00:17.901649 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.901625 2577 scope.go:117] "RemoveContainer" containerID="f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6" Apr 23 15:00:17.902009 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:00:17.901988 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6\": container with ID starting with f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6 not found: ID does not exist" containerID="f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6" Apr 23 15:00:17.902086 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.902017 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6"} err="failed to get container status \"f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6\": rpc error: code = 
NotFound desc = could not find container \"f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6\": container with ID starting with f47d86fabe29a376e8fd614039173b5f0f2368aa6664f01b15708115c29510a6 not found: ID does not exist" Apr 23 15:00:17.917060 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.917034 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"] Apr 23 15:00:17.919535 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.919509 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"] Apr 23 15:00:18.097558 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:18.097521 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d762ca36-74c1-4055-b0c8-bda608d1c686" path="/var/lib/kubelet/pods/d762ca36-74c1-4055-b0c8-bda608d1c686/volumes" Apr 23 15:01:04.096374 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.096338 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67649dbb8b-vpg24"] Apr 23 15:01:04.096858 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.096619 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d762ca36-74c1-4055-b0c8-bda608d1c686" containerName="console" Apr 23 15:01:04.096858 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.096630 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d762ca36-74c1-4055-b0c8-bda608d1c686" containerName="console" Apr 23 15:01:04.096858 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.096684 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d762ca36-74c1-4055-b0c8-bda608d1c686" containerName="console" Apr 23 15:01:04.099421 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.099404 2577 util.go:30] "No sandbox for pod can be found. 
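The sequence above (SyncLoop DELETE, volume unmount/detach, PLEG ContainerDied, SyncLoop REMOVE) is the full kubelet-side lifecycle of one pod. A minimal sketch of how to pull those transitions out of a saved journal dump is below; the file path is an assumption for illustration, and the two sample lines are copied from the log above (on a real node you would populate the file with something like `journalctl -u kubelet --no-pager > "$LOG"`).

```shell
#!/bin/sh
# Hedged sketch: filter one pod's lifecycle events out of a kubelet log dump.
POD='openshift-console/console-78c5b7b795-rswtk'
LOG=/tmp/kubelet.log   # assumed path; normally filled from journalctl

# Two representative lines taken verbatim from the journal excerpt above.
cat > "$LOG" <<'EOF'
Apr 23 15:00:17.917060 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.917034 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"]
Apr 23 15:00:17.919535 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:00:17.919509 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78c5b7b795-rswtk"]
EOF

# Keep only the high-level lifecycle transitions: SyncLoop ADD/UPDATE/DELETE/
# REMOVE plus PLEG container events and probe results, for this pod only.
grep -E 'SyncLoop (ADD|UPDATE|DELETE|REMOVE|\(PLEG\)|\(probe\))' "$LOG" \
  | grep -F "$POD"
```

The same two-stage filter (event type first, then pod name) works for the volume reconciler lines as well by swapping the first pattern for `operationExecutor\.(VerifyControllerAttachedVolume|MountVolume|UnmountVolume)`.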
Need to start a new one" pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.110811 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.110773 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67649dbb8b-vpg24"] Apr 23 15:01:04.246861 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-oauth-config\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.246861 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-console-config\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.247093 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-serving-cert\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.247093 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246902 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jvcn\" (UniqueName: \"kubernetes.io/projected/de5a4ec9-f768-4280-8455-53c89217cd36-kube-api-access-9jvcn\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 
23 15:01:04.247093 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-service-ca\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.247093 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-trusted-ca-bundle\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.247093 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.246962 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-oauth-serving-cert\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.347798 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-trusted-ca-bundle\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.347798 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-oauth-serving-cert\") pod 
\"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.347798 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-oauth-config\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348039 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-console-config\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348039 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-serving-cert\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348039 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jvcn\" (UniqueName: \"kubernetes.io/projected/de5a4ec9-f768-4280-8455-53c89217cd36-kube-api-access-9jvcn\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348039 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.347974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-service-ca\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348595 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.348572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-trusted-ca-bundle\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348717 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.348614 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-console-config\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348717 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.348663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-service-ca\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.348717 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.348660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-oauth-serving-cert\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.350306 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.350288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-oauth-config\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.350391 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.350366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-serving-cert\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.357418 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.357394 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jvcn\" (UniqueName: \"kubernetes.io/projected/de5a4ec9-f768-4280-8455-53c89217cd36-kube-api-access-9jvcn\") pod \"console-67649dbb8b-vpg24\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") " pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:01:04.408978 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.408937 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67649dbb8b-vpg24"
Apr 23 15:01:04.531824 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:04.531794 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67649dbb8b-vpg24"]
Apr 23 15:01:04.535656 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:01:04.535628 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5a4ec9_f768_4280_8455_53c89217cd36.slice/crio-53a07465246a78b2a56242724024435f1c5300a8b88fa1a51460ce9225132afa WatchSource:0}: Error finding container 53a07465246a78b2a56242724024435f1c5300a8b88fa1a51460ce9225132afa: Status 404 returned error can't find the container with id 53a07465246a78b2a56242724024435f1c5300a8b88fa1a51460ce9225132afa
Apr 23 15:01:05.018590 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:05.018549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67649dbb8b-vpg24" event={"ID":"de5a4ec9-f768-4280-8455-53c89217cd36","Type":"ContainerStarted","Data":"240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114"}
Apr 23 15:01:05.018590 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:05.018595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67649dbb8b-vpg24" event={"ID":"de5a4ec9-f768-4280-8455-53c89217cd36","Type":"ContainerStarted","Data":"53a07465246a78b2a56242724024435f1c5300a8b88fa1a51460ce9225132afa"}
Apr 23 15:01:05.038264 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:05.038155 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67649dbb8b-vpg24" podStartSLOduration=1.038134546 podStartE2EDuration="1.038134546s" podCreationTimestamp="2026-04-23 15:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 15:01:05.036920013 +0000 UTC m=+225.544002434" watchObservedRunningTime="2026-04-23 15:01:05.038134546 +0000 UTC m=+225.545216922"
Apr 23 15:01:14.409284 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:14.409236 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67649dbb8b-vpg24"
Apr 23 15:01:14.409790 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:14.409301 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67649dbb8b-vpg24"
Apr 23 15:01:14.414269 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:14.414241 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67649dbb8b-vpg24"
Apr 23 15:01:15.049095 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:15.049062 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67649dbb8b-vpg24"
Apr 23 15:01:15.102174 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:15.102145 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77bf44cd89-2wbwq"]
Apr 23 15:01:36.536500 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.536418 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fpdpm"]
Apr 23 15:01:36.539824 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.539805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.542515 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.542498 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 15:01:36.547065 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.547043 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fpdpm"]
Apr 23 15:01:36.604813 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.604771 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-original-pull-secret\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.604813 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.604821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-kubelet-config\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.605027 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.604844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-dbus\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.705855 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.705811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-original-pull-secret\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.706042 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.705867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-kubelet-config\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.706042 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.705894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-dbus\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.706042 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.705969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-kubelet-config\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.706153 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.706048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-dbus\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.708076 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.708058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9bcef650-4335-4e0e-bbf1-0e9794d90e8c-original-pull-secret\") pod \"global-pull-secret-syncer-fpdpm\" (UID: \"9bcef650-4335-4e0e-bbf1-0e9794d90e8c\") " pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.850038 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.849938 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fpdpm"
Apr 23 15:01:36.972121 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:36.972089 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fpdpm"]
Apr 23 15:01:36.975401 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:01:36.975361 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bcef650_4335_4e0e_bbf1_0e9794d90e8c.slice/crio-f3f2998707806d609f5ffaa79361b988e5eeaf27e6003f295bb7287ae67756d3 WatchSource:0}: Error finding container f3f2998707806d609f5ffaa79361b988e5eeaf27e6003f295bb7287ae67756d3: Status 404 returned error can't find the container with id f3f2998707806d609f5ffaa79361b988e5eeaf27e6003f295bb7287ae67756d3
Apr 23 15:01:37.105230 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:37.105142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fpdpm" event={"ID":"9bcef650-4335-4e0e-bbf1-0e9794d90e8c","Type":"ContainerStarted","Data":"f3f2998707806d609f5ffaa79361b988e5eeaf27e6003f295bb7287ae67756d3"}
Apr 23 15:01:40.124603 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.124550 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-77bf44cd89-2wbwq" podUID="6c8c0668-8281-480d-bacf-26a469bd5abd" containerName="console" containerID="cri-o://9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501" gracePeriod=15
Apr 23 15:01:40.364644 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.364622 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77bf44cd89-2wbwq_6c8c0668-8281-480d-bacf-26a469bd5abd/console/0.log"
Apr 23 15:01:40.364783 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.364758 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77bf44cd89-2wbwq"
Apr 23 15:01:40.442291 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442254 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-trusted-ca-bundle\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442302 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-serving-cert\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442328 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-console-config\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442360 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86gk6\" (UniqueName: \"kubernetes.io/projected/6c8c0668-8281-480d-bacf-26a469bd5abd-kube-api-access-86gk6\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442390 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-oauth-config\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442438 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-service-ca\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442458 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-oauth-serving-cert\") pod \"6c8c0668-8281-480d-bacf-26a469bd5abd\" (UID: \"6c8c0668-8281-480d-bacf-26a469bd5abd\") "
Apr 23 15:01:40.442910 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442792 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:01:40.442910 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442817 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-console-config" (OuterVolumeSpecName: "console-config") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:01:40.443009 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.442952 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-service-ca" (OuterVolumeSpecName: "service-ca") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:01:40.443009 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.443002 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:01:40.444552 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.444526 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8c0668-8281-480d-bacf-26a469bd5abd-kube-api-access-86gk6" (OuterVolumeSpecName: "kube-api-access-86gk6") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "kube-api-access-86gk6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 15:01:40.444729 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.444710 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 15:01:40.444781 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.444760 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6c8c0668-8281-480d-bacf-26a469bd5abd" (UID: "6c8c0668-8281-480d-bacf-26a469bd5abd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 15:01:40.543062 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543025 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-trusted-ca-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:40.543062 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543055 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:40.543062 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543066 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-console-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:40.543295 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543077 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-86gk6\" (UniqueName: \"kubernetes.io/projected/6c8c0668-8281-480d-bacf-26a469bd5abd-kube-api-access-86gk6\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:40.543295 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543087 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c8c0668-8281-480d-bacf-26a469bd5abd-console-oauth-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:40.543295 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543095 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-service-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:40.543295 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:40.543104 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c8c0668-8281-480d-bacf-26a469bd5abd-oauth-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:01:41.117327 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.117298 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-77bf44cd89-2wbwq_6c8c0668-8281-480d-bacf-26a469bd5abd/console/0.log"
Apr 23 15:01:41.117493 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.117339 2577 generic.go:358] "Generic (PLEG): container finished" podID="6c8c0668-8281-480d-bacf-26a469bd5abd" containerID="9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501" exitCode=2
Apr 23 15:01:41.117493 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.117374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bf44cd89-2wbwq" event={"ID":"6c8c0668-8281-480d-bacf-26a469bd5abd","Type":"ContainerDied","Data":"9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501"}
Apr 23 15:01:41.117493 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.117397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77bf44cd89-2wbwq" event={"ID":"6c8c0668-8281-480d-bacf-26a469bd5abd","Type":"ContainerDied","Data":"41f1f4cb02224dfb1cebde4b10e314514042e15d5b9f7121cbd648c0946385fe"}
Apr 23 15:01:41.117493 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.117411 2577 scope.go:117] "RemoveContainer" containerID="9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501"
Apr 23 15:01:41.117493 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.117442 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77bf44cd89-2wbwq"
Apr 23 15:01:41.125056 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.125036 2577 scope.go:117] "RemoveContainer" containerID="9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501"
Apr 23 15:01:41.125328 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:01:41.125276 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501\": container with ID starting with 9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501 not found: ID does not exist" containerID="9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501"
Apr 23 15:01:41.125328 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.125303 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501"} err="failed to get container status \"9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501\": rpc error: code = NotFound desc = could not find container \"9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501\": container with ID starting with 9094902cb1feae5610ff68a48e0e4b34e2055dcc94fa69c797aac8a09a726501 not found: ID does not exist"
Apr 23 15:01:41.138574 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.138548 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-77bf44cd89-2wbwq"]
Apr 23 15:01:41.142883 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:41.142861 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-77bf44cd89-2wbwq"]
Apr 23 15:01:42.093393 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:42.093359 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8c0668-8281-480d-bacf-26a469bd5abd" path="/var/lib/kubelet/pods/6c8c0668-8281-480d-bacf-26a469bd5abd/volumes"
Apr 23 15:01:46.133761 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:46.133719 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fpdpm" event={"ID":"9bcef650-4335-4e0e-bbf1-0e9794d90e8c","Type":"ContainerStarted","Data":"85a049e71b16087762be7129683cd5e98f6dd2f4a51793e23e64c3022fcd328c"}
Apr 23 15:01:46.153339 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:01:46.152759 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fpdpm" podStartSLOduration=1.495756424 podStartE2EDuration="10.152742051s" podCreationTimestamp="2026-04-23 15:01:36 +0000 UTC" firstStartedPulling="2026-04-23 15:01:36.977049739 +0000 UTC m=+257.484132104" lastFinishedPulling="2026-04-23 15:01:45.634035362 +0000 UTC m=+266.141117731" observedRunningTime="2026-04-23 15:01:46.151584572 +0000 UTC m=+266.658666957" watchObservedRunningTime="2026-04-23 15:01:46.152742051 +0000 UTC m=+266.659824445"
Apr 23 15:02:19.439064 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.439034 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"]
Apr 23 15:02:19.439548 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.439331 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c8c0668-8281-480d-bacf-26a469bd5abd" containerName="console"
Apr 23 15:02:19.439548 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.439343 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8c0668-8281-480d-bacf-26a469bd5abd" containerName="console"
Apr 23 15:02:19.439548 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.439409 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c8c0668-8281-480d-bacf-26a469bd5abd" containerName="console"
Apr 23 15:02:19.462104 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.462071 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"]
Apr 23 15:02:19.462267 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.462224 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.464970 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.464943 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 15:02:19.465114 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.464988 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 15:02:19.465610 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.465592 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x5mt8\""
Apr 23 15:02:19.556979 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.556938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.556979 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.556980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm2tz\" (UniqueName: \"kubernetes.io/projected/0f59706e-6326-48dc-8079-51dec591f896-kube-api-access-nm2tz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.557197 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.557070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.657935 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.657893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.658150 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.657989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.658150 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.658013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm2tz\" (UniqueName: \"kubernetes.io/projected/0f59706e-6326-48dc-8079-51dec591f896-kube-api-access-nm2tz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.658359 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.658337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.658425 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.658390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.667457 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.667435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm2tz\" (UniqueName: \"kubernetes.io/projected/0f59706e-6326-48dc-8079-51dec591f896-kube-api-access-nm2tz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.771798 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.771691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:19.892342 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.892304 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"]
Apr 23 15:02:19.895909 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:02:19.895882 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f59706e_6326_48dc_8079_51dec591f896.slice/crio-e548555612b6d51f14f1722b3d9b23e3a1351fe57ca5723b0f8b30f5e2efb098 WatchSource:0}: Error finding container e548555612b6d51f14f1722b3d9b23e3a1351fe57ca5723b0f8b30f5e2efb098: Status 404 returned error can't find the container with id e548555612b6d51f14f1722b3d9b23e3a1351fe57ca5723b0f8b30f5e2efb098
Apr 23 15:02:19.966052 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.966024 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 15:02:19.967099 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:19.967078 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 15:02:20.229355 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:20.229319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt" event={"ID":"0f59706e-6326-48dc-8079-51dec591f896","Type":"ContainerStarted","Data":"e548555612b6d51f14f1722b3d9b23e3a1351fe57ca5723b0f8b30f5e2efb098"}
Apr 23 15:02:27.253189 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:27.253149 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f59706e-6326-48dc-8079-51dec591f896" containerID="8a7cfe6b3a14e37036f1598a345dbe8a08f1688d0085354b0ee9eb2936556815" exitCode=0
Apr 23 15:02:27.253556 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:27.253208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt" event={"ID":"0f59706e-6326-48dc-8079-51dec591f896","Type":"ContainerDied","Data":"8a7cfe6b3a14e37036f1598a345dbe8a08f1688d0085354b0ee9eb2936556815"}
Apr 23 15:02:27.254181 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:27.254167 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 15:02:30.262310 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:30.262268 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f59706e-6326-48dc-8079-51dec591f896" containerID="a96b177083b479788a5e378b7e5a9eb8bfec3275e3b89c1c4a1aa89b8f6b5738" exitCode=0
Apr 23 15:02:30.262801 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:30.262351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt" event={"ID":"0f59706e-6326-48dc-8079-51dec591f896","Type":"ContainerDied","Data":"a96b177083b479788a5e378b7e5a9eb8bfec3275e3b89c1c4a1aa89b8f6b5738"}
Apr 23 15:02:36.283293 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:36.283251 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f59706e-6326-48dc-8079-51dec591f896" containerID="e782a800b74cf9534a5ee9c55e14bd755b4c1dcf418d4ba0dca3b769db345e4b" exitCode=0
Apr 23 15:02:36.283657 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:36.283336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt" event={"ID":"0f59706e-6326-48dc-8079-51dec591f896","Type":"ContainerDied","Data":"e782a800b74cf9534a5ee9c55e14bd755b4c1dcf418d4ba0dca3b769db345e4b"}
Apr 23 15:02:37.404970 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.404947 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:37.519610 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.519571 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-bundle\") pod \"0f59706e-6326-48dc-8079-51dec591f896\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") "
Apr 23 15:02:37.519832 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.519660 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-util\") pod \"0f59706e-6326-48dc-8079-51dec591f896\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") "
Apr 23 15:02:37.519832 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.519724 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm2tz\" (UniqueName: \"kubernetes.io/projected/0f59706e-6326-48dc-8079-51dec591f896-kube-api-access-nm2tz\") pod \"0f59706e-6326-48dc-8079-51dec591f896\" (UID: \"0f59706e-6326-48dc-8079-51dec591f896\") "
Apr 23 15:02:37.520307 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.520279 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-bundle" (OuterVolumeSpecName: "bundle") pod "0f59706e-6326-48dc-8079-51dec591f896" (UID: "0f59706e-6326-48dc-8079-51dec591f896"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 15:02:37.521898 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.521854 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f59706e-6326-48dc-8079-51dec591f896-kube-api-access-nm2tz" (OuterVolumeSpecName: "kube-api-access-nm2tz") pod "0f59706e-6326-48dc-8079-51dec591f896" (UID: "0f59706e-6326-48dc-8079-51dec591f896"). InnerVolumeSpecName "kube-api-access-nm2tz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 15:02:37.523894 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.523870 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-util" (OuterVolumeSpecName: "util") pod "0f59706e-6326-48dc-8079-51dec591f896" (UID: "0f59706e-6326-48dc-8079-51dec591f896"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 15:02:37.620936 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.620854 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nm2tz\" (UniqueName: \"kubernetes.io/projected/0f59706e-6326-48dc-8079-51dec591f896-kube-api-access-nm2tz\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:02:37.620936 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.620883 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:02:37.620936 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:37.620893 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f59706e-6326-48dc-8079-51dec591f896-util\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:02:38.290374 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:38.290335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt" event={"ID":"0f59706e-6326-48dc-8079-51dec591f896","Type":"ContainerDied","Data":"e548555612b6d51f14f1722b3d9b23e3a1351fe57ca5723b0f8b30f5e2efb098"}
Apr 23 15:02:38.290374 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:38.290365 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dncrpt"
Apr 23 15:02:38.290576 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:38.290372 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e548555612b6d51f14f1722b3d9b23e3a1351fe57ca5723b0f8b30f5e2efb098"
Apr 23 15:02:42.563865 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.563831 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v"]
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564106 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="pull"
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564116 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="pull"
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564133 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="extract"
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564138 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="extract"
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564149 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="util"
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564155 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="util"
Apr 23 15:02:42.564251 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.564205 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f59706e-6326-48dc-8079-51dec591f896" containerName="extract"
Apr 23 15:02:42.611750 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.611685 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v"]
Apr 23 15:02:42.611888 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.611843 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v"
Apr 23 15:02:42.614690 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.614670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 23 15:02:42.614792 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.614670 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-kbgtd\""
Apr 23 15:02:42.614792 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.614721 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 23 15:02:42.763729 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.763651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e30fe2b-8c30-4f72-a64b-325a196789bd-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-ldn7v\" (UID:
\"5e30fe2b-8c30-4f72-a64b-325a196789bd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:42.763925 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.763805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7j6f\" (UniqueName: \"kubernetes.io/projected/5e30fe2b-8c30-4f72-a64b-325a196789bd-kube-api-access-d7j6f\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-ldn7v\" (UID: \"5e30fe2b-8c30-4f72-a64b-325a196789bd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:42.865051 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.864973 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e30fe2b-8c30-4f72-a64b-325a196789bd-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-ldn7v\" (UID: \"5e30fe2b-8c30-4f72-a64b-325a196789bd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:42.865051 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.865028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7j6f\" (UniqueName: \"kubernetes.io/projected/5e30fe2b-8c30-4f72-a64b-325a196789bd-kube-api-access-d7j6f\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-ldn7v\" (UID: \"5e30fe2b-8c30-4f72-a64b-325a196789bd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:42.865400 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.865376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e30fe2b-8c30-4f72-a64b-325a196789bd-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-ldn7v\" (UID: \"5e30fe2b-8c30-4f72-a64b-325a196789bd\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:42.876220 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.876192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7j6f\" (UniqueName: \"kubernetes.io/projected/5e30fe2b-8c30-4f72-a64b-325a196789bd-kube-api-access-d7j6f\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-ldn7v\" (UID: \"5e30fe2b-8c30-4f72-a64b-325a196789bd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:42.920557 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:42.920526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" Apr 23 15:02:43.045093 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:43.044947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v"] Apr 23 15:02:43.048858 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:02:43.048825 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e30fe2b_8c30_4f72_a64b_325a196789bd.slice/crio-d4ba65d082bd5b70c3d712a67dc8e02b22d33770db42173c9e04e9ae98fcc259 WatchSource:0}: Error finding container d4ba65d082bd5b70c3d712a67dc8e02b22d33770db42173c9e04e9ae98fcc259: Status 404 returned error can't find the container with id d4ba65d082bd5b70c3d712a67dc8e02b22d33770db42173c9e04e9ae98fcc259 Apr 23 15:02:43.305986 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:43.305946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" event={"ID":"5e30fe2b-8c30-4f72-a64b-325a196789bd","Type":"ContainerStarted","Data":"d4ba65d082bd5b70c3d712a67dc8e02b22d33770db42173c9e04e9ae98fcc259"} Apr 23 15:02:46.316827 
ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:46.316781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" event={"ID":"5e30fe2b-8c30-4f72-a64b-325a196789bd","Type":"ContainerStarted","Data":"c96a6c3e93d4875aa45b948c3bb8d0e243b0dc37114b469bbcb7ae6b3d61d5d8"} Apr 23 15:02:46.340947 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:46.340883 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-ldn7v" podStartSLOduration=1.808144961 podStartE2EDuration="4.340862984s" podCreationTimestamp="2026-04-23 15:02:42 +0000 UTC" firstStartedPulling="2026-04-23 15:02:43.051283055 +0000 UTC m=+323.558365422" lastFinishedPulling="2026-04-23 15:02:45.584001082 +0000 UTC m=+326.091083445" observedRunningTime="2026-04-23 15:02:46.338391928 +0000 UTC m=+326.845474314" watchObservedRunningTime="2026-04-23 15:02:46.340862984 +0000 UTC m=+326.847945370" Apr 23 15:02:49.033929 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.033897 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-x67qz"] Apr 23 15:02:49.037206 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.037186 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.041366 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.041346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-d92j4\"" Apr 23 15:02:49.042063 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.042041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 15:02:49.042166 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.042048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 15:02:49.048929 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.048906 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-x67qz"] Apr 23 15:02:49.220809 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.220767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e914e75-9252-4043-a24b-3d8744d6d179-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-x67qz\" (UID: \"9e914e75-9252-4043-a24b-3d8744d6d179\") " pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.220976 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.220856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2sdt\" (UniqueName: \"kubernetes.io/projected/9e914e75-9252-4043-a24b-3d8744d6d179-kube-api-access-l2sdt\") pod \"cert-manager-webhook-587ccfb98-x67qz\" (UID: \"9e914e75-9252-4043-a24b-3d8744d6d179\") " pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.321605 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.321526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9e914e75-9252-4043-a24b-3d8744d6d179-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-x67qz\" (UID: \"9e914e75-9252-4043-a24b-3d8744d6d179\") " pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.321605 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.321577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2sdt\" (UniqueName: \"kubernetes.io/projected/9e914e75-9252-4043-a24b-3d8744d6d179-kube-api-access-l2sdt\") pod \"cert-manager-webhook-587ccfb98-x67qz\" (UID: \"9e914e75-9252-4043-a24b-3d8744d6d179\") " pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.343032 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.342993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e914e75-9252-4043-a24b-3d8744d6d179-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-x67qz\" (UID: \"9e914e75-9252-4043-a24b-3d8744d6d179\") " pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.344902 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.344877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2sdt\" (UniqueName: \"kubernetes.io/projected/9e914e75-9252-4043-a24b-3d8744d6d179-kube-api-access-l2sdt\") pod \"cert-manager-webhook-587ccfb98-x67qz\" (UID: \"9e914e75-9252-4043-a24b-3d8744d6d179\") " pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.364750 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.364694 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:49.487976 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:49.487947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-x67qz"] Apr 23 15:02:49.490535 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:02:49.490504 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e914e75_9252_4043_a24b_3d8744d6d179.slice/crio-097d3094332827fef319a5fbe2070d10b580369a4fed2cfd162739951b8d32f8 WatchSource:0}: Error finding container 097d3094332827fef319a5fbe2070d10b580369a4fed2cfd162739951b8d32f8: Status 404 returned error can't find the container with id 097d3094332827fef319a5fbe2070d10b580369a4fed2cfd162739951b8d32f8 Apr 23 15:02:50.174064 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.174027 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-kmpqj"] Apr 23 15:02:50.179491 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.179472 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.182449 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.182424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-424gh\"" Apr 23 15:02:50.191510 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.191482 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-kmpqj"] Apr 23 15:02:50.329287 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.329252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" event={"ID":"9e914e75-9252-4043-a24b-3d8744d6d179","Type":"ContainerStarted","Data":"097d3094332827fef319a5fbe2070d10b580369a4fed2cfd162739951b8d32f8"} Apr 23 15:02:50.331714 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.331673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/303b1a25-6110-4486-a417-69a196ea5719-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-kmpqj\" (UID: \"303b1a25-6110-4486-a417-69a196ea5719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.331827 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.331745 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq8gg\" (UniqueName: \"kubernetes.io/projected/303b1a25-6110-4486-a417-69a196ea5719-kube-api-access-jq8gg\") pod \"cert-manager-cainjector-68b757865b-kmpqj\" (UID: \"303b1a25-6110-4486-a417-69a196ea5719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.433132 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.433044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq8gg\" (UniqueName: 
\"kubernetes.io/projected/303b1a25-6110-4486-a417-69a196ea5719-kube-api-access-jq8gg\") pod \"cert-manager-cainjector-68b757865b-kmpqj\" (UID: \"303b1a25-6110-4486-a417-69a196ea5719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.433308 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.433172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/303b1a25-6110-4486-a417-69a196ea5719-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-kmpqj\" (UID: \"303b1a25-6110-4486-a417-69a196ea5719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.446348 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.446315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq8gg\" (UniqueName: \"kubernetes.io/projected/303b1a25-6110-4486-a417-69a196ea5719-kube-api-access-jq8gg\") pod \"cert-manager-cainjector-68b757865b-kmpqj\" (UID: \"303b1a25-6110-4486-a417-69a196ea5719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.447158 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.447133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/303b1a25-6110-4486-a417-69a196ea5719-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-kmpqj\" (UID: \"303b1a25-6110-4486-a417-69a196ea5719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.489100 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.489059 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" Apr 23 15:02:50.627479 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:50.627429 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-kmpqj"] Apr 23 15:02:50.629885 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:02:50.629851 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod303b1a25_6110_4486_a417_69a196ea5719.slice/crio-446a8b515a741c8eb9b8fc10d2ba977ed71b20b9549140bc81f1a9b1a87ac72a WatchSource:0}: Error finding container 446a8b515a741c8eb9b8fc10d2ba977ed71b20b9549140bc81f1a9b1a87ac72a: Status 404 returned error can't find the container with id 446a8b515a741c8eb9b8fc10d2ba977ed71b20b9549140bc81f1a9b1a87ac72a Apr 23 15:02:51.334827 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:51.334767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" event={"ID":"303b1a25-6110-4486-a417-69a196ea5719","Type":"ContainerStarted","Data":"446a8b515a741c8eb9b8fc10d2ba977ed71b20b9549140bc81f1a9b1a87ac72a"} Apr 23 15:02:53.342457 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:53.342407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" event={"ID":"9e914e75-9252-4043-a24b-3d8744d6d179","Type":"ContainerStarted","Data":"bec5bbb4a3024fab9a146a3ad526255800bc95a634bd06bd42ca68a9a8eb5b1b"} Apr 23 15:02:53.342942 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:53.342573 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:02:53.343865 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:53.343840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" 
event={"ID":"303b1a25-6110-4486-a417-69a196ea5719","Type":"ContainerStarted","Data":"d29cb30cca31b2b2fba7ea5081a4818be6312a06f06d5aadb7095d81d2f6a840"} Apr 23 15:02:53.360759 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:53.360713 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" podStartSLOduration=1.453020645 podStartE2EDuration="4.360671685s" podCreationTimestamp="2026-04-23 15:02:49 +0000 UTC" firstStartedPulling="2026-04-23 15:02:49.492184713 +0000 UTC m=+329.999267080" lastFinishedPulling="2026-04-23 15:02:52.399835742 +0000 UTC m=+332.906918120" observedRunningTime="2026-04-23 15:02:53.358829559 +0000 UTC m=+333.865911956" watchObservedRunningTime="2026-04-23 15:02:53.360671685 +0000 UTC m=+333.867754073" Apr 23 15:02:53.374876 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:53.374826 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-kmpqj" podStartSLOduration=1.60142056 podStartE2EDuration="3.374812943s" podCreationTimestamp="2026-04-23 15:02:50 +0000 UTC" firstStartedPulling="2026-04-23 15:02:50.632164038 +0000 UTC m=+331.139246402" lastFinishedPulling="2026-04-23 15:02:52.405556416 +0000 UTC m=+332.912638785" observedRunningTime="2026-04-23 15:02:53.374041454 +0000 UTC m=+333.881123836" watchObservedRunningTime="2026-04-23 15:02:53.374812943 +0000 UTC m=+333.881895330" Apr 23 15:02:59.349410 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:02:59.349329 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-x67qz" Apr 23 15:03:11.502824 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.502787 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw"] Apr 23 15:03:11.506272 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.506251 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.508897 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.508872 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 15:03:11.508897 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.508887 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 15:03:11.509626 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.509613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x5mt8\"" Apr 23 15:03:11.514774 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.514753 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw"] Apr 23 15:03:11.613585 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.613542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.613585 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.613587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpddg\" (UniqueName: \"kubernetes.io/projected/3343a73c-e346-40d1-b978-10e66eb532fb-kube-api-access-hpddg\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.613818 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.613667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.714451 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.714407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.714596 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.714471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpddg\" (UniqueName: \"kubernetes.io/projected/3343a73c-e346-40d1-b978-10e66eb532fb-kube-api-access-hpddg\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.714596 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.714509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.714888 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.714871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.714938 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.714923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.723033 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.723003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpddg\" (UniqueName: \"kubernetes.io/projected/3343a73c-e346-40d1-b978-10e66eb532fb-kube-api-access-hpddg\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.816072 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.815976 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" Apr 23 15:03:11.937252 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:11.937227 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw"] Apr 23 15:03:11.939752 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:03:11.939721 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3343a73c_e346_40d1_b978_10e66eb532fb.slice/crio-84781809b4ae1d1dd9ec0ec695f79bd980a4f6f5db733f13218fe3b70e42f2b2 WatchSource:0}: Error finding container 84781809b4ae1d1dd9ec0ec695f79bd980a4f6f5db733f13218fe3b70e42f2b2: Status 404 returned error can't find the container with id 84781809b4ae1d1dd9ec0ec695f79bd980a4f6f5db733f13218fe3b70e42f2b2 Apr 23 15:03:12.403880 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:12.403848 2577 generic.go:358] "Generic (PLEG): container finished" podID="3343a73c-e346-40d1-b978-10e66eb532fb" containerID="1a82689a8165c7ba2fb68844f9fcc27ead532d19ff447102d401fec94c537a2e" exitCode=0 Apr 23 15:03:12.404054 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:12.403943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" event={"ID":"3343a73c-e346-40d1-b978-10e66eb532fb","Type":"ContainerDied","Data":"1a82689a8165c7ba2fb68844f9fcc27ead532d19ff447102d401fec94c537a2e"} Apr 23 15:03:12.404054 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:12.403982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" event={"ID":"3343a73c-e346-40d1-b978-10e66eb532fb","Type":"ContainerStarted","Data":"84781809b4ae1d1dd9ec0ec695f79bd980a4f6f5db733f13218fe3b70e42f2b2"} Apr 23 15:03:15.415300 ip-10-0-141-16 kubenswrapper[2577]: I0423 
15:03:15.415265 2577 generic.go:358] "Generic (PLEG): container finished" podID="3343a73c-e346-40d1-b978-10e66eb532fb" containerID="f77071e693323fdba96350366bcc247ca35c1e008dfd0ef0ce4f5ec6a6530ab0" exitCode=0
Apr 23 15:03:15.415762 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:15.415350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" event={"ID":"3343a73c-e346-40d1-b978-10e66eb532fb","Type":"ContainerDied","Data":"f77071e693323fdba96350366bcc247ca35c1e008dfd0ef0ce4f5ec6a6530ab0"}
Apr 23 15:03:16.420720 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:16.420660 2577 generic.go:358] "Generic (PLEG): container finished" podID="3343a73c-e346-40d1-b978-10e66eb532fb" containerID="ba786bab0340d26cfaaa70589f01cf1c44eb18c6ba696b897697927939a7c3fe" exitCode=0
Apr 23 15:03:16.421057 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:16.420738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" event={"ID":"3343a73c-e346-40d1-b978-10e66eb532fb","Type":"ContainerDied","Data":"ba786bab0340d26cfaaa70589f01cf1c44eb18c6ba696b897697927939a7c3fe"}
Apr 23 15:03:17.542654 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.542629 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw"
Apr 23 15:03:17.670334 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.670302 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-bundle\") pod \"3343a73c-e346-40d1-b978-10e66eb532fb\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") "
Apr 23 15:03:17.670501 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.670343 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpddg\" (UniqueName: \"kubernetes.io/projected/3343a73c-e346-40d1-b978-10e66eb532fb-kube-api-access-hpddg\") pod \"3343a73c-e346-40d1-b978-10e66eb532fb\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") "
Apr 23 15:03:17.670501 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.670405 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-util\") pod \"3343a73c-e346-40d1-b978-10e66eb532fb\" (UID: \"3343a73c-e346-40d1-b978-10e66eb532fb\") "
Apr 23 15:03:17.670766 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.670720 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-bundle" (OuterVolumeSpecName: "bundle") pod "3343a73c-e346-40d1-b978-10e66eb532fb" (UID: "3343a73c-e346-40d1-b978-10e66eb532fb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 15:03:17.672374 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.672349 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3343a73c-e346-40d1-b978-10e66eb532fb-kube-api-access-hpddg" (OuterVolumeSpecName: "kube-api-access-hpddg") pod "3343a73c-e346-40d1-b978-10e66eb532fb" (UID: "3343a73c-e346-40d1-b978-10e66eb532fb"). InnerVolumeSpecName "kube-api-access-hpddg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 15:03:17.675389 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.675312 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-util" (OuterVolumeSpecName: "util") pod "3343a73c-e346-40d1-b978-10e66eb532fb" (UID: "3343a73c-e346-40d1-b978-10e66eb532fb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 15:03:17.771937 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.771892 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:03:17.771937 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.771932 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpddg\" (UniqueName: \"kubernetes.io/projected/3343a73c-e346-40d1-b978-10e66eb532fb-kube-api-access-hpddg\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:03:17.771937 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:17.771946 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3343a73c-e346-40d1-b978-10e66eb532fb-util\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:03:18.429453 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:18.429413 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw" event={"ID":"3343a73c-e346-40d1-b978-10e66eb532fb","Type":"ContainerDied","Data":"84781809b4ae1d1dd9ec0ec695f79bd980a4f6f5db733f13218fe3b70e42f2b2"}
Apr 23 15:03:18.429453 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:18.429455 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84781809b4ae1d1dd9ec0ec695f79bd980a4f6f5db733f13218fe3b70e42f2b2"
Apr 23 15:03:18.429659 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:18.429477 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e4q2fw"
Apr 23 15:03:23.121883 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.121842 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"]
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122291 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="pull"
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122314 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="pull"
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122331 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="util"
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122339 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="util"
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122357 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="extract"
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122375 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="extract"
Apr 23 15:03:23.122544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.122459 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3343a73c-e346-40d1-b978-10e66eb532fb" containerName="extract"
Apr 23 15:03:23.126877 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.126856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.129188 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.129166 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 23 15:03:23.129428 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.129405 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 23 15:03:23.129925 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.129911 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-9pphg\""
Apr 23 15:03:23.134335 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.134312 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"]
Apr 23 15:03:23.217831 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.217788 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfdk\" (UniqueName: \"kubernetes.io/projected/a8ac0d8b-3d01-43af-a664-507e367c0b0e-kube-api-access-znfdk\") pod \"jobset-operator-747c5859c7-4hk2f\" (UID: \"a8ac0d8b-3d01-43af-a664-507e367c0b0e\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.218004 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.217894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8ac0d8b-3d01-43af-a664-507e367c0b0e-tmp\") pod \"jobset-operator-747c5859c7-4hk2f\" (UID: \"a8ac0d8b-3d01-43af-a664-507e367c0b0e\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.318755 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.318719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znfdk\" (UniqueName: \"kubernetes.io/projected/a8ac0d8b-3d01-43af-a664-507e367c0b0e-kube-api-access-znfdk\") pod \"jobset-operator-747c5859c7-4hk2f\" (UID: \"a8ac0d8b-3d01-43af-a664-507e367c0b0e\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.318877 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.318779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8ac0d8b-3d01-43af-a664-507e367c0b0e-tmp\") pod \"jobset-operator-747c5859c7-4hk2f\" (UID: \"a8ac0d8b-3d01-43af-a664-507e367c0b0e\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.319129 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.319112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8ac0d8b-3d01-43af-a664-507e367c0b0e-tmp\") pod \"jobset-operator-747c5859c7-4hk2f\" (UID: \"a8ac0d8b-3d01-43af-a664-507e367c0b0e\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.328581 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.328545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfdk\" (UniqueName: \"kubernetes.io/projected/a8ac0d8b-3d01-43af-a664-507e367c0b0e-kube-api-access-znfdk\") pod \"jobset-operator-747c5859c7-4hk2f\" (UID: \"a8ac0d8b-3d01-43af-a664-507e367c0b0e\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.435835 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.435802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"
Apr 23 15:03:23.576130 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:23.576098 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f"]
Apr 23 15:03:23.578437 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:03:23.578413 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ac0d8b_3d01_43af_a664_507e367c0b0e.slice/crio-d6f8d2770738fb41168373441b03c79c10ad6479db87265c0d1f269d76865045 WatchSource:0}: Error finding container d6f8d2770738fb41168373441b03c79c10ad6479db87265c0d1f269d76865045: Status 404 returned error can't find the container with id d6f8d2770738fb41168373441b03c79c10ad6479db87265c0d1f269d76865045
Apr 23 15:03:24.449814 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:24.449779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f" event={"ID":"a8ac0d8b-3d01-43af-a664-507e367c0b0e","Type":"ContainerStarted","Data":"d6f8d2770738fb41168373441b03c79c10ad6479db87265c0d1f269d76865045"}
Apr 23 15:03:26.457644 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:26.457602 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f" event={"ID":"a8ac0d8b-3d01-43af-a664-507e367c0b0e","Type":"ContainerStarted","Data":"e2799324e502822d9c1d52a710f79709e562026319b8707a52940ce302fddf29"}
Apr 23 15:03:26.493861 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:26.493809 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-4hk2f" podStartSLOduration=1.028486861 podStartE2EDuration="3.493794538s" podCreationTimestamp="2026-04-23 15:03:23 +0000 UTC" firstStartedPulling="2026-04-23 15:03:23.579985147 +0000 UTC m=+364.087067512" lastFinishedPulling="2026-04-23 15:03:26.04529281 +0000 UTC m=+366.552375189" observedRunningTime="2026-04-23 15:03:26.492891046 +0000 UTC m=+366.999973432" watchObservedRunningTime="2026-04-23 15:03:26.493794538 +0000 UTC m=+367.000876922"
Apr 23 15:03:52.545580 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.545546 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"]
Apr 23 15:03:52.558130 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.558104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.559233 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.559208 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"]
Apr 23 15:03:52.561238 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.561214 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\""
Apr 23 15:03:52.561323 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.561214 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 23 15:03:52.561897 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.561874 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-dwtj6\""
Apr 23 15:03:52.561897 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.561894 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 23 15:03:52.562054 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.561874 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\""
Apr 23 15:03:52.666118 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.666085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc613cfd-3e25-4570-a930-ebb12df77e8a-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.666305 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.666153 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9bqs\" (UniqueName: \"kubernetes.io/projected/fc613cfd-3e25-4570-a930-ebb12df77e8a-kube-api-access-h9bqs\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.666305 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.666254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/fc613cfd-3e25-4570-a930-ebb12df77e8a-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.766977 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.766947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9bqs\" (UniqueName: \"kubernetes.io/projected/fc613cfd-3e25-4570-a930-ebb12df77e8a-kube-api-access-h9bqs\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.767150 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.767008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/fc613cfd-3e25-4570-a930-ebb12df77e8a-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.767150 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.767057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc613cfd-3e25-4570-a930-ebb12df77e8a-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.767656 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.767632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/fc613cfd-3e25-4570-a930-ebb12df77e8a-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.769453 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.769432 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc613cfd-3e25-4570-a930-ebb12df77e8a-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.775500 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.775479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9bqs\" (UniqueName: \"kubernetes.io/projected/fc613cfd-3e25-4570-a930-ebb12df77e8a-kube-api-access-h9bqs\") pod \"kubeflow-trainer-controller-manager-55f5694779-w24ld\" (UID: \"fc613cfd-3e25-4570-a930-ebb12df77e8a\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.868783 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.868669 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:52.993595 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:52.993569 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"]
Apr 23 15:03:52.995641 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:03:52.995613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc613cfd_3e25_4570_a930_ebb12df77e8a.slice/crio-7103cfa2277d063c0065d64f6ea34abff411039c3c69f1b3a7507f228cefc96f WatchSource:0}: Error finding container 7103cfa2277d063c0065d64f6ea34abff411039c3c69f1b3a7507f228cefc96f: Status 404 returned error can't find the container with id 7103cfa2277d063c0065d64f6ea34abff411039c3c69f1b3a7507f228cefc96f
Apr 23 15:03:53.546004 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:53.545955 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld" event={"ID":"fc613cfd-3e25-4570-a930-ebb12df77e8a","Type":"ContainerStarted","Data":"7103cfa2277d063c0065d64f6ea34abff411039c3c69f1b3a7507f228cefc96f"}
Apr 23 15:03:55.555665 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:55.555625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld" event={"ID":"fc613cfd-3e25-4570-a930-ebb12df77e8a","Type":"ContainerStarted","Data":"ef1b3bd316a486bcd6bbc0f42932095f9908813b3b3187a4e06a2c2866b48845"}
Apr 23 15:03:55.556188 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:55.555684 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:03:55.582183 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:03:55.582122 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld" podStartSLOduration=1.482024779 podStartE2EDuration="3.58210451s" podCreationTimestamp="2026-04-23 15:03:52 +0000 UTC" firstStartedPulling="2026-04-23 15:03:52.997355636 +0000 UTC m=+393.504438000" lastFinishedPulling="2026-04-23 15:03:55.097435357 +0000 UTC m=+395.604517731" observedRunningTime="2026-04-23 15:03:55.579356879 +0000 UTC m=+396.086439266" watchObservedRunningTime="2026-04-23 15:03:55.58210451 +0000 UTC m=+396.089186897"
Apr 23 15:04:11.564400 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:04:11.564368 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-w24ld"
Apr 23 15:05:44.999475 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:44.999439 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"]
Apr 23 15:05:45.001591 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.001574 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"
Apr 23 15:05:45.003954 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.003928 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-8ccrr\"/\"openshift-service-ca.crt\""
Apr 23 15:05:45.004072 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.003934 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-8ccrr\"/\"kube-root-ca.crt\""
Apr 23 15:05:45.004664 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.004646 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-8ccrr\"/\"default-dockercfg-9cpfb\""
Apr 23 15:05:45.022429 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.022397 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"]
Apr 23 15:05:45.082105 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.082074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xqlp\" (UniqueName: \"kubernetes.io/projected/0ee4af63-5eb4-4309-bc38-0d1fd42a7702-kube-api-access-7xqlp\") pod \"test-trainjob-m8fxg-node-0-0-lp882\" (UID: \"0ee4af63-5eb4-4309-bc38-0d1fd42a7702\") " pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"
Apr 23 15:05:45.182925 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.182892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xqlp\" (UniqueName: \"kubernetes.io/projected/0ee4af63-5eb4-4309-bc38-0d1fd42a7702-kube-api-access-7xqlp\") pod \"test-trainjob-m8fxg-node-0-0-lp882\" (UID: \"0ee4af63-5eb4-4309-bc38-0d1fd42a7702\") " pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"
Apr 23 15:05:45.192253 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.192223 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xqlp\" (UniqueName: \"kubernetes.io/projected/0ee4af63-5eb4-4309-bc38-0d1fd42a7702-kube-api-access-7xqlp\") pod \"test-trainjob-m8fxg-node-0-0-lp882\" (UID: \"0ee4af63-5eb4-4309-bc38-0d1fd42a7702\") " pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"
Apr 23 15:05:45.310739 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.310617 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"
Apr 23 15:05:45.435328 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.435299 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"]
Apr 23 15:05:45.438028 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:05:45.437997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee4af63_5eb4_4309_bc38_0d1fd42a7702.slice/crio-94dcaf01f50a5a5d0f06058310e8e34e7a12ce81b8c70b196a367e4e514e260e WatchSource:0}: Error finding container 94dcaf01f50a5a5d0f06058310e8e34e7a12ce81b8c70b196a367e4e514e260e: Status 404 returned error can't find the container with id 94dcaf01f50a5a5d0f06058310e8e34e7a12ce81b8c70b196a367e4e514e260e
Apr 23 15:05:45.911081 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:05:45.911045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" event={"ID":"0ee4af63-5eb4-4309-bc38-0d1fd42a7702","Type":"ContainerStarted","Data":"94dcaf01f50a5a5d0f06058310e8e34e7a12ce81b8c70b196a367e4e514e260e"}
Apr 23 15:07:19.993313 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:07:19.993285 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 15:07:19.995410 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:07:19.995383 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 15:10:11.257087 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:10:11.257045 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67649dbb8b-vpg24"]
Apr 23 15:10:36.278618 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:10:36.278510 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67649dbb8b-vpg24" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" containerName="console" containerID="cri-o://240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114" gracePeriod=15
Apr 23 15:10:45.045126 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:10:45.045072 2577 patch_prober.go:28] interesting pod/console-67649dbb8b-vpg24 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body=
Apr 23 15:10:45.045777 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:10:45.045143 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-67649dbb8b-vpg24" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused"
Apr 23 15:10:55.045170 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:10:55.045121 2577 patch_prober.go:28] interesting pod/console-67649dbb8b-vpg24 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body=
Apr 23 15:10:55.045725 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:10:55.045195 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-67649dbb8b-vpg24" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused"
Apr 23 15:11:03.753595 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.753552 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67649dbb8b-vpg24_de5a4ec9-f768-4280-8455-53c89217cd36/console/0.log"
Apr 23 15:11:03.754096 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.753647 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67649dbb8b-vpg24"
Apr 23 15:11:03.847747 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.847686 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-serving-cert\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.847747 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.847746 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-oauth-serving-cert\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.847988 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.847782 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-console-config\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.847988 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.847910 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-trusted-ca-bundle\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.848096 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.847995 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-oauth-config\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.848096 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848031 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jvcn\" (UniqueName: \"kubernetes.io/projected/de5a4ec9-f768-4280-8455-53c89217cd36-kube-api-access-9jvcn\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.848096 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848074 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-service-ca\") pod \"de5a4ec9-f768-4280-8455-53c89217cd36\" (UID: \"de5a4ec9-f768-4280-8455-53c89217cd36\") "
Apr 23 15:11:03.848248 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848187 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:11:03.848248 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848193 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-console-config" (OuterVolumeSpecName: "console-config") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:11:03.848365 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848340 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:11:03.848682 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848648 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-service-ca" (OuterVolumeSpecName: "service-ca") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 15:11:03.848682 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848677 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-oauth-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:03.848888 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848722 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-console-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:03.848888 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.848738 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-trusted-ca-bundle\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:03.850409 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.850364 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 15:11:03.850534 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.850414 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5a4ec9-f768-4280-8455-53c89217cd36-kube-api-access-9jvcn" (OuterVolumeSpecName: "kube-api-access-9jvcn") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "kube-api-access-9jvcn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 15:11:03.850534 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.850430 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de5a4ec9-f768-4280-8455-53c89217cd36" (UID: "de5a4ec9-f768-4280-8455-53c89217cd36"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 15:11:03.949905 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.949868 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jvcn\" (UniqueName: \"kubernetes.io/projected/de5a4ec9-f768-4280-8455-53c89217cd36-kube-api-access-9jvcn\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:03.949905 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.949899 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de5a4ec9-f768-4280-8455-53c89217cd36-service-ca\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:03.949905 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.949910 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-serving-cert\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:03.950203 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:03.949920 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de5a4ec9-f768-4280-8455-53c89217cd36-console-oauth-config\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\""
Apr 23 15:11:04.032187 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.032109 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67649dbb8b-vpg24_de5a4ec9-f768-4280-8455-53c89217cd36/console/0.log"
Apr 23 15:11:04.032187 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.032148 2577 generic.go:358] "Generic (PLEG): container finished" podID="de5a4ec9-f768-4280-8455-53c89217cd36" containerID="240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114" exitCode=2
Apr 23 15:11:04.032395 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.032182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67649dbb8b-vpg24" event={"ID":"de5a4ec9-f768-4280-8455-53c89217cd36","Type":"ContainerDied","Data":"240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114"}
Apr 23 15:11:04.032395 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.032245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67649dbb8b-vpg24" event={"ID":"de5a4ec9-f768-4280-8455-53c89217cd36","Type":"ContainerDied","Data":"53a07465246a78b2a56242724024435f1c5300a8b88fa1a51460ce9225132afa"}
Apr 23 15:11:04.032395 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.032255 2577 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-67649dbb8b-vpg24" Apr 23 15:11:04.032395 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.032265 2577 scope.go:117] "RemoveContainer" containerID="240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114" Apr 23 15:11:04.042417 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.042390 2577 scope.go:117] "RemoveContainer" containerID="240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114" Apr 23 15:11:04.042744 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:11:04.042723 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114\": container with ID starting with 240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114 not found: ID does not exist" containerID="240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114" Apr 23 15:11:04.042791 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.042754 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114"} err="failed to get container status \"240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114\": rpc error: code = NotFound desc = could not find container \"240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114\": container with ID starting with 240104f5e0440d43bff64717d40ef40e034bcd9efd198444502cbb9bee2dd114 not found: ID does not exist" Apr 23 15:11:04.053738 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.053691 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67649dbb8b-vpg24"] Apr 23 15:11:04.056829 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:04.056806 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67649dbb8b-vpg24"] Apr 23 15:11:04.094810 ip-10-0-141-16 kubenswrapper[2577]: I0423 
15:11:04.094777 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" path="/var/lib/kubelet/pods/de5a4ec9-f768-4280-8455-53c89217cd36/volumes" Apr 23 15:11:05.036756 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:05.036694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" event={"ID":"0ee4af63-5eb4-4309-bc38-0d1fd42a7702","Type":"ContainerStarted","Data":"f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d"} Apr 23 15:11:05.054096 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:05.054050 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" podStartSLOduration=2.206145596 podStartE2EDuration="5m21.054035389s" podCreationTimestamp="2026-04-23 15:05:44 +0000 UTC" firstStartedPulling="2026-04-23 15:05:45.439994871 +0000 UTC m=+505.947077235" lastFinishedPulling="2026-04-23 15:11:04.287884649 +0000 UTC m=+824.794967028" observedRunningTime="2026-04-23 15:11:05.053331998 +0000 UTC m=+825.560414384" watchObservedRunningTime="2026-04-23 15:11:05.054035389 +0000 UTC m=+825.561117776" Apr 23 15:11:10.057315 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:10.057280 2577 generic.go:358] "Generic (PLEG): container finished" podID="0ee4af63-5eb4-4309-bc38-0d1fd42a7702" containerID="f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d" exitCode=0 Apr 23 15:11:10.057745 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:10.057353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" event={"ID":"0ee4af63-5eb4-4309-bc38-0d1fd42a7702","Type":"ContainerDied","Data":"f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d"} Apr 23 15:11:11.188117 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:11.188091 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" Apr 23 15:11:11.318828 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:11.318750 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xqlp\" (UniqueName: \"kubernetes.io/projected/0ee4af63-5eb4-4309-bc38-0d1fd42a7702-kube-api-access-7xqlp\") pod \"0ee4af63-5eb4-4309-bc38-0d1fd42a7702\" (UID: \"0ee4af63-5eb4-4309-bc38-0d1fd42a7702\") " Apr 23 15:11:11.320929 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:11.320902 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee4af63-5eb4-4309-bc38-0d1fd42a7702-kube-api-access-7xqlp" (OuterVolumeSpecName: "kube-api-access-7xqlp") pod "0ee4af63-5eb4-4309-bc38-0d1fd42a7702" (UID: "0ee4af63-5eb4-4309-bc38-0d1fd42a7702"). InnerVolumeSpecName "kube-api-access-7xqlp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:11:11.419759 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:11.419720 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xqlp\" (UniqueName: \"kubernetes.io/projected/0ee4af63-5eb4-4309-bc38-0d1fd42a7702-kube-api-access-7xqlp\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:11:12.065632 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.065609 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" Apr 23 15:11:12.065811 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.065601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882" event={"ID":"0ee4af63-5eb4-4309-bc38-0d1fd42a7702","Type":"ContainerDied","Data":"94dcaf01f50a5a5d0f06058310e8e34e7a12ce81b8c70b196a367e4e514e260e"} Apr 23 15:11:12.065811 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.065753 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94dcaf01f50a5a5d0f06058310e8e34e7a12ce81b8c70b196a367e4e514e260e" Apr 23 15:11:12.608421 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608385 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7"] Apr 23 15:11:12.608803 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608661 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" containerName="console" Apr 23 15:11:12.608803 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608672 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" containerName="console" Apr 23 15:11:12.608803 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608689 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ee4af63-5eb4-4309-bc38-0d1fd42a7702" containerName="node" Apr 23 15:11:12.608803 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608694 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee4af63-5eb4-4309-bc38-0d1fd42a7702" containerName="node" Apr 23 15:11:12.608803 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608772 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ee4af63-5eb4-4309-bc38-0d1fd42a7702" containerName="node" Apr 23 15:11:12.608803 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.608783 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="de5a4ec9-f768-4280-8455-53c89217cd36" containerName="console" Apr 23 15:11:12.900906 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.900873 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7"] Apr 23 15:11:12.901065 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.900981 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:11:12.903484 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.903464 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-gmxv6\"/\"openshift-service-ca.crt\"" Apr 23 15:11:12.904086 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.904067 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-gmxv6\"/\"default-dockercfg-qsg2n\"" Apr 23 15:11:12.904132 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:12.904087 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-gmxv6\"/\"kube-root-ca.crt\"" Apr 23 15:11:13.033301 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:13.033258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2b82\" (UniqueName: \"kubernetes.io/projected/681ce542-0f81-4ec1-84de-2c8953d7dc00-kube-api-access-d2b82\") pod \"test-trainjob-ncx8l-node-0-0-kc4z7\" (UID: \"681ce542-0f81-4ec1-84de-2c8953d7dc00\") " pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:11:13.133824 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:13.133788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2b82\" (UniqueName: \"kubernetes.io/projected/681ce542-0f81-4ec1-84de-2c8953d7dc00-kube-api-access-d2b82\") pod \"test-trainjob-ncx8l-node-0-0-kc4z7\" (UID: \"681ce542-0f81-4ec1-84de-2c8953d7dc00\") " 
pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:11:13.142155 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:13.142129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2b82\" (UniqueName: \"kubernetes.io/projected/681ce542-0f81-4ec1-84de-2c8953d7dc00-kube-api-access-d2b82\") pod \"test-trainjob-ncx8l-node-0-0-kc4z7\" (UID: \"681ce542-0f81-4ec1-84de-2c8953d7dc00\") " pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:11:13.210229 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:13.210128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:11:13.335546 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:13.335521 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7"] Apr 23 15:11:13.337980 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:11:13.337944 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod681ce542_0f81_4ec1_84de_2c8953d7dc00.slice/crio-d4b26be5183159aa4fbe6e9f90f1c656ecb28967337651af574577fa6dad149e WatchSource:0}: Error finding container d4b26be5183159aa4fbe6e9f90f1c656ecb28967337651af574577fa6dad149e: Status 404 returned error can't find the container with id d4b26be5183159aa4fbe6e9f90f1c656ecb28967337651af574577fa6dad149e Apr 23 15:11:13.339991 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:13.339972 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 15:11:14.073778 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:11:14.073740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" event={"ID":"681ce542-0f81-4ec1-84de-2c8953d7dc00","Type":"ContainerStarted","Data":"d4b26be5183159aa4fbe6e9f90f1c656ecb28967337651af574577fa6dad149e"} Apr 23 15:12:20.025813 
ip-10-0-141-16 kubenswrapper[2577]: I0423 15:12:20.025782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:12:20.025813 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:12:20.025802 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:15:53.049539 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:15:53.049446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" event={"ID":"681ce542-0f81-4ec1-84de-2c8953d7dc00","Type":"ContainerStarted","Data":"f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea"} Apr 23 15:15:53.076818 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:15:53.076767 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" podStartSLOduration=1.7430794760000001 podStartE2EDuration="4m41.076750211s" podCreationTimestamp="2026-04-23 15:11:12 +0000 UTC" firstStartedPulling="2026-04-23 15:11:13.340099817 +0000 UTC m=+833.847182181" lastFinishedPulling="2026-04-23 15:15:52.673770553 +0000 UTC m=+1113.180852916" observedRunningTime="2026-04-23 15:15:53.075595942 +0000 UTC m=+1113.582678328" watchObservedRunningTime="2026-04-23 15:15:53.076750211 +0000 UTC m=+1113.583832598" Apr 23 15:15:59.070451 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:15:59.070415 2577 generic.go:358] "Generic (PLEG): container finished" podID="681ce542-0f81-4ec1-84de-2c8953d7dc00" containerID="f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea" exitCode=0 Apr 23 15:15:59.070922 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:15:59.070491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" 
event={"ID":"681ce542-0f81-4ec1-84de-2c8953d7dc00","Type":"ContainerDied","Data":"f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea"} Apr 23 15:16:00.195758 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:00.195737 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:16:00.375381 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:00.375302 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2b82\" (UniqueName: \"kubernetes.io/projected/681ce542-0f81-4ec1-84de-2c8953d7dc00-kube-api-access-d2b82\") pod \"681ce542-0f81-4ec1-84de-2c8953d7dc00\" (UID: \"681ce542-0f81-4ec1-84de-2c8953d7dc00\") " Apr 23 15:16:00.377515 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:00.377477 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/681ce542-0f81-4ec1-84de-2c8953d7dc00-kube-api-access-d2b82" (OuterVolumeSpecName: "kube-api-access-d2b82") pod "681ce542-0f81-4ec1-84de-2c8953d7dc00" (UID: "681ce542-0f81-4ec1-84de-2c8953d7dc00"). InnerVolumeSpecName "kube-api-access-d2b82". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:16:00.476233 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:00.476192 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2b82\" (UniqueName: \"kubernetes.io/projected/681ce542-0f81-4ec1-84de-2c8953d7dc00-kube-api-access-d2b82\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:16:01.078482 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:01.078451 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" Apr 23 15:16:01.078482 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:01.078476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7" event={"ID":"681ce542-0f81-4ec1-84de-2c8953d7dc00","Type":"ContainerDied","Data":"d4b26be5183159aa4fbe6e9f90f1c656ecb28967337651af574577fa6dad149e"} Apr 23 15:16:01.078684 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:01.078506 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b26be5183159aa4fbe6e9f90f1c656ecb28967337651af574577fa6dad149e" Apr 23 15:16:02.069552 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.069520 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7"] Apr 23 15:16:02.069936 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.069815 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="681ce542-0f81-4ec1-84de-2c8953d7dc00" containerName="node" Apr 23 15:16:02.069936 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.069827 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="681ce542-0f81-4ec1-84de-2c8953d7dc00" containerName="node" Apr 23 15:16:02.069936 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.069890 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="681ce542-0f81-4ec1-84de-2c8953d7dc00" containerName="node" Apr 23 15:16:02.398891 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.398857 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:16:02.401953 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.401928 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ljvbf\"/\"openshift-service-ca.crt\"" Apr 23 15:16:02.402111 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.402088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-ljvbf\"/\"default-dockercfg-r8c6g\"" Apr 23 15:16:02.403004 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.402984 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-ljvbf\"/\"kube-root-ca.crt\"" Apr 23 15:16:02.404945 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.404926 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7"] Apr 23 15:16:02.495891 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.495857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbv7\" (UniqueName: \"kubernetes.io/projected/bbe949fc-4b43-40fa-a70d-954967e42af1-kube-api-access-2rbv7\") pod \"test-trainjob-7d876-node-0-0-ts9r7\" (UID: \"bbe949fc-4b43-40fa-a70d-954967e42af1\") " pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:16:02.597178 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.597139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbv7\" (UniqueName: \"kubernetes.io/projected/bbe949fc-4b43-40fa-a70d-954967e42af1-kube-api-access-2rbv7\") pod \"test-trainjob-7d876-node-0-0-ts9r7\" (UID: \"bbe949fc-4b43-40fa-a70d-954967e42af1\") " pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:16:02.606321 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.606296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbv7\" (UniqueName: 
\"kubernetes.io/projected/bbe949fc-4b43-40fa-a70d-954967e42af1-kube-api-access-2rbv7\") pod \"test-trainjob-7d876-node-0-0-ts9r7\" (UID: \"bbe949fc-4b43-40fa-a70d-954967e42af1\") " pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:16:02.713345 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.713259 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:16:02.840687 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:02.840653 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7"] Apr 23 15:16:02.843714 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:16:02.843659 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe949fc_4b43_40fa_a70d_954967e42af1.slice/crio-720ca5899ea68c67493d4f4e63553bf31c6a845a2d4e59f064ecd50936766acb WatchSource:0}: Error finding container 720ca5899ea68c67493d4f4e63553bf31c6a845a2d4e59f064ecd50936766acb: Status 404 returned error can't find the container with id 720ca5899ea68c67493d4f4e63553bf31c6a845a2d4e59f064ecd50936766acb Apr 23 15:16:03.086584 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:16:03.086491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" event={"ID":"bbe949fc-4b43-40fa-a70d-954967e42af1","Type":"ContainerStarted","Data":"720ca5899ea68c67493d4f4e63553bf31c6a845a2d4e59f064ecd50936766acb"} Apr 23 15:17:37.280901 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:37.280868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:17:37.281446 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:37.280987 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:17:38.425908 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:38.425873 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" event={"ID":"bbe949fc-4b43-40fa-a70d-954967e42af1","Type":"ContainerStarted","Data":"9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374"} Apr 23 15:17:38.447465 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:38.447408 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" podStartSLOduration=1.478694334 podStartE2EDuration="1m36.447386853s" podCreationTimestamp="2026-04-23 15:16:02 +0000 UTC" firstStartedPulling="2026-04-23 15:16:02.845526325 +0000 UTC m=+1123.352608689" lastFinishedPulling="2026-04-23 15:17:37.814218841 +0000 UTC m=+1218.321301208" observedRunningTime="2026-04-23 15:17:38.444294626 +0000 UTC m=+1218.951377014" watchObservedRunningTime="2026-04-23 15:17:38.447386853 +0000 UTC m=+1218.954469241" Apr 23 15:17:41.436962 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:41.436928 2577 generic.go:358] "Generic (PLEG): container finished" podID="bbe949fc-4b43-40fa-a70d-954967e42af1" containerID="9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374" exitCode=0 Apr 23 15:17:41.436962 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:41.436965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" event={"ID":"bbe949fc-4b43-40fa-a70d-954967e42af1","Type":"ContainerDied","Data":"9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374"} Apr 23 15:17:42.563423 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:42.563395 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:17:42.680155 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:42.680117 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rbv7\" (UniqueName: \"kubernetes.io/projected/bbe949fc-4b43-40fa-a70d-954967e42af1-kube-api-access-2rbv7\") pod \"bbe949fc-4b43-40fa-a70d-954967e42af1\" (UID: \"bbe949fc-4b43-40fa-a70d-954967e42af1\") " Apr 23 15:17:42.682326 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:42.682292 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe949fc-4b43-40fa-a70d-954967e42af1-kube-api-access-2rbv7" (OuterVolumeSpecName: "kube-api-access-2rbv7") pod "bbe949fc-4b43-40fa-a70d-954967e42af1" (UID: "bbe949fc-4b43-40fa-a70d-954967e42af1"). InnerVolumeSpecName "kube-api-access-2rbv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:17:42.781757 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:42.781645 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rbv7\" (UniqueName: \"kubernetes.io/projected/bbe949fc-4b43-40fa-a70d-954967e42af1-kube-api-access-2rbv7\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:17:43.444056 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:43.444024 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" Apr 23 15:17:43.444235 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:43.444028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7" event={"ID":"bbe949fc-4b43-40fa-a70d-954967e42af1","Type":"ContainerDied","Data":"720ca5899ea68c67493d4f4e63553bf31c6a845a2d4e59f064ecd50936766acb"} Apr 23 15:17:43.444235 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:43.444129 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="720ca5899ea68c67493d4f4e63553bf31c6a845a2d4e59f064ecd50936766acb" Apr 23 15:17:44.075629 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.075596 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9"] Apr 23 15:17:44.076048 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.075904 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bbe949fc-4b43-40fa-a70d-954967e42af1" containerName="node" Apr 23 15:17:44.076048 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.075916 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe949fc-4b43-40fa-a70d-954967e42af1" containerName="node" Apr 23 15:17:44.076048 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.075988 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bbe949fc-4b43-40fa-a70d-954967e42af1" containerName="node" Apr 23 15:17:44.145035 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.144998 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9"] Apr 23 15:17:44.145191 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.145149 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:17:44.147791 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.147748 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-4lkhz\"/\"openshift-service-ca.crt\"" Apr 23 15:17:44.147927 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.147832 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-4lkhz\"/\"kube-root-ca.crt\"" Apr 23 15:17:44.147927 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.147879 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-4lkhz\"/\"default-dockercfg-k5cjl\"" Apr 23 15:17:44.294295 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.294251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk772\" (UniqueName: \"kubernetes.io/projected/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7-kube-api-access-kk772\") pod \"test-trainjob-xp5x6-node-0-0-sg5c9\" (UID: \"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7\") " pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:17:44.395550 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.395518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk772\" (UniqueName: \"kubernetes.io/projected/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7-kube-api-access-kk772\") pod \"test-trainjob-xp5x6-node-0-0-sg5c9\" (UID: \"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7\") " pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:17:44.404010 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.403969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk772\" (UniqueName: \"kubernetes.io/projected/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7-kube-api-access-kk772\") pod \"test-trainjob-xp5x6-node-0-0-sg5c9\" (UID: \"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7\") " 
pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:17:44.454808 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.454780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:17:44.576976 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.576919 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9"] Apr 23 15:17:44.579407 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:17:44.579363 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac2dcd0a_5552_4f39_9a71_4f59a55e7ec7.slice/crio-275dcf5e26221228b2f44c99dea41cb2536eb430fed140a7d36ad1e1551bbd40 WatchSource:0}: Error finding container 275dcf5e26221228b2f44c99dea41cb2536eb430fed140a7d36ad1e1551bbd40: Status 404 returned error can't find the container with id 275dcf5e26221228b2f44c99dea41cb2536eb430fed140a7d36ad1e1551bbd40 Apr 23 15:17:44.581892 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:44.581876 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 15:17:45.454582 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:17:45.454527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" event={"ID":"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7","Type":"ContainerStarted","Data":"275dcf5e26221228b2f44c99dea41cb2536eb430fed140a7d36ad1e1551bbd40"} Apr 23 15:22:37.304039 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:22:37.303959 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:22:37.305237 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:22:37.305216 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:25:33.099876 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:33.099830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" event={"ID":"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7","Type":"ContainerStarted","Data":"f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9"} Apr 23 15:25:33.102488 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:33.102464 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-4lkhz\"/\"default-dockercfg-k5cjl\"" Apr 23 15:25:33.130207 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:33.130104 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" podStartSLOduration=0.856652202 podStartE2EDuration="7m49.130087936s" podCreationTimestamp="2026-04-23 15:17:44 +0000 UTC" firstStartedPulling="2026-04-23 15:17:44.582009133 +0000 UTC m=+1225.089091500" lastFinishedPulling="2026-04-23 15:25:32.855444866 +0000 UTC m=+1693.362527234" observedRunningTime="2026-04-23 15:25:33.12826913 +0000 UTC m=+1693.635351516" watchObservedRunningTime="2026-04-23 15:25:33.130087936 +0000 UTC m=+1693.637170322" Apr 23 15:25:33.221542 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:33.221511 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-4lkhz\"/\"kube-root-ca.crt\"" Apr 23 15:25:33.231968 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:33.231945 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-4lkhz\"/\"openshift-service-ca.crt\"" Apr 23 15:25:38.117450 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:38.117415 2577 generic.go:358] "Generic (PLEG): container finished" podID="ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7" 
containerID="f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9" exitCode=0 Apr 23 15:25:38.117904 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:38.117460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" event={"ID":"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7","Type":"ContainerDied","Data":"f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9"} Apr 23 15:25:39.261630 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:39.261605 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:25:39.373832 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:39.373749 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk772\" (UniqueName: \"kubernetes.io/projected/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7-kube-api-access-kk772\") pod \"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7\" (UID: \"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7\") " Apr 23 15:25:39.375958 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:39.375933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7-kube-api-access-kk772" (OuterVolumeSpecName: "kube-api-access-kk772") pod "ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7" (UID: "ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7"). InnerVolumeSpecName "kube-api-access-kk772". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:25:39.474633 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:39.474597 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kk772\" (UniqueName: \"kubernetes.io/projected/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7-kube-api-access-kk772\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:25:40.125418 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.125385 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" Apr 23 15:25:40.125418 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.125393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9" event={"ID":"ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7","Type":"ContainerDied","Data":"275dcf5e26221228b2f44c99dea41cb2536eb430fed140a7d36ad1e1551bbd40"} Apr 23 15:25:40.125418 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.125421 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="275dcf5e26221228b2f44c99dea41cb2536eb430fed140a7d36ad1e1551bbd40" Apr 23 15:25:40.583664 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.583622 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds"] Apr 23 15:25:40.584162 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.584052 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7" containerName="node" Apr 23 15:25:40.584162 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.584070 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7" containerName="node" Apr 23 15:25:40.584162 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.584154 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7" containerName="node" Apr 23 15:25:40.607403 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.607365 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds"] Apr 23 15:25:40.607570 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.607498 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:25:40.610069 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.610041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-slrp6\"/\"openshift-service-ca.crt\"" Apr 23 15:25:40.610322 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.610299 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-slrp6\"/\"kube-root-ca.crt\"" Apr 23 15:25:40.610880 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.610862 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-slrp6\"/\"default-dockercfg-7vmck\"" Apr 23 15:25:40.683713 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.683665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pk4\" (UniqueName: \"kubernetes.io/projected/2b47a718-a9d0-4264-a3a0-4453dc2cadea-kube-api-access-x4pk4\") pod \"test-trainjob-9vkfq-node-0-0-mm8ds\" (UID: \"2b47a718-a9d0-4264-a3a0-4453dc2cadea\") " pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:25:40.784695 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.784646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pk4\" (UniqueName: \"kubernetes.io/projected/2b47a718-a9d0-4264-a3a0-4453dc2cadea-kube-api-access-x4pk4\") pod \"test-trainjob-9vkfq-node-0-0-mm8ds\" (UID: \"2b47a718-a9d0-4264-a3a0-4453dc2cadea\") " pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:25:40.793648 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.793618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pk4\" (UniqueName: \"kubernetes.io/projected/2b47a718-a9d0-4264-a3a0-4453dc2cadea-kube-api-access-x4pk4\") pod \"test-trainjob-9vkfq-node-0-0-mm8ds\" (UID: \"2b47a718-a9d0-4264-a3a0-4453dc2cadea\") " 
pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:25:40.919044 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:40.919014 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:25:41.044296 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:41.044271 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds"] Apr 23 15:25:41.050451 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:25:41.050417 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b47a718_a9d0_4264_a3a0_4453dc2cadea.slice/crio-3f9c31a11db588f260daedffd639e0b68d9057fd749e0a41bab37a4a3bce5244 WatchSource:0}: Error finding container 3f9c31a11db588f260daedffd639e0b68d9057fd749e0a41bab37a4a3bce5244: Status 404 returned error can't find the container with id 3f9c31a11db588f260daedffd639e0b68d9057fd749e0a41bab37a4a3bce5244 Apr 23 15:25:41.052766 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:41.052746 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 15:25:41.131243 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:25:41.131206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" event={"ID":"2b47a718-a9d0-4264-a3a0-4453dc2cadea","Type":"ContainerStarted","Data":"3f9c31a11db588f260daedffd639e0b68d9057fd749e0a41bab37a4a3bce5244"} Apr 23 15:27:37.326742 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:27:37.326715 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:27:37.328342 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:27:37.328322 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:33:25.824788 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:33:25.824750 2577 eviction_manager.go:376] "Eviction manager: attempting to reclaim" resourceName="ephemeral-storage" Apr 23 15:33:25.906467 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:33:25.824811 2577 container_gc.go:86] "Attempting to delete unused containers" Apr 23 15:33:25.906467 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:33:25.826482 2577 scope.go:117] "RemoveContainer" containerID="f77071e693323fdba96350366bcc247ca35c1e008dfd0ef0ce4f5ec6a6530ab0" Apr 23 15:33:34.677022 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:33:34.676994 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasDiskPressure" Apr 23 15:34:32.012045 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:32.011889 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 15:34:32.012045 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:32.011954 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:34:32.012045 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:34:32.011968 2577 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:34:37.329085 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:37.329048 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 23 15:34:37.329085 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:37.329088 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by 
RST_STREAM with error code: CANCEL" Apr 23 15:34:37.329553 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:34:37.329099 2577 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 15:34:37.332223 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:37.332182 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 15:34:37.332223 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:37.332228 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:34:37.332402 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:34:37.332240 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:35:25.827488 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:35:25.827403 2577 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="f77071e693323fdba96350366bcc247ca35c1e008dfd0ef0ce4f5ec6a6530ab0" Apr 23 15:35:25.827488 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:35:25.827475 2577 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" containerID="f77071e693323fdba96350366bcc247ca35c1e008dfd0ef0ce4f5ec6a6530ab0" Apr 23 15:35:25.827488 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:35:25.827496 2577 scope.go:117] "RemoveContainer" containerID="a96b177083b479788a5e378b7e5a9eb8bfec3275e3b89c1c4a1aa89b8f6b5738" Apr 23 15:37:02.012769 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:37:02.012714 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" 
filter="nil" Apr 23 15:37:02.012769 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:37:02.012770 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:37:02.013259 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:02.012780 2577 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:37:23.573789 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.573361 2577 scope.go:117] "RemoveContainer" containerID="f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9" Apr 23 15:37:23.630429 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.630309 2577 scope.go:117] "RemoveContainer" containerID="9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374" Apr 23 15:37:23.675833 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.675808 2577 scope.go:117] "RemoveContainer" containerID="f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d" Apr 23 15:37:23.698294 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.698272 2577 scope.go:117] "RemoveContainer" containerID="1a82689a8165c7ba2fb68844f9fcc27ead532d19ff447102d401fec94c537a2e" Apr 23 15:37:23.706195 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.706172 2577 scope.go:117] "RemoveContainer" containerID="ba786bab0340d26cfaaa70589f01cf1c44eb18c6ba696b897697927939a7c3fe" Apr 23 15:37:23.713897 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.713873 2577 scope.go:117] "RemoveContainer" containerID="f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea" Apr 23 15:37:23.731501 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.731477 2577 scope.go:117] "RemoveContainer" containerID="e782a800b74cf9534a5ee9c55e14bd755b4c1dcf418d4ba0dca3b769db345e4b" Apr 23 15:37:23.776661 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.776638 2577 scope.go:117] "RemoveContainer" 
containerID="8a7cfe6b3a14e37036f1598a345dbe8a08f1688d0085354b0ee9eb2936556815" Apr 23 15:37:23.806881 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.806862 2577 image_gc_manager.go:447] "Attempting to delete unused images" Apr 23 15:37:23.824601 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.824531 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:37:23.826252 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.826226 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="5151a6030289f6d1ef2c984ebd3e465632a3bf64de79db6f7b3d6e2e638b0557" size=1065600018 runtimeHandler="" Apr 23 15:37:23.917910 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:23.917877 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="ac4be6c7a52584c773ae754a4ccfb9fb1db440f4c9d858ad0f78765a85625b4b" size=1065006420 runtimeHandler="" Apr 23 15:37:24.299327 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:37:24.299276 2577 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_Contraction_l_Ailk_Bjlk_Cijk_Dijk_CU104_gfx90a.co: no space left on device); artifact err: provided artifact is a container image" image="quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6" Apr 23 15:37:24.299542 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:37:24.299488 2577 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:node,Image:quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6,Command:[python -c import torch; print(f'PyTorch version: {torch.__version__}'); print('Training completed successfully')],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:,HostPort:0,ContainerPort:29500,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:PET_NNODES,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NPROC_PER_NODE,Value:1,ValueFrom:nil,},EnvVar{Name:PET_NODE_RANK,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:PET_MASTER_ADDR,Value:test-trainjob-9vkfq-node-0-0.test-trainjob-9vkfq,ValueFrom:nil,},EnvVar{Name:PET_MASTER_PORT,Value:29500,ValueFrom:nil,},EnvVar{Name:JOB_COMPLETION_INDEX,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.labels['batch.kubernetes.io/job-completion-index'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x4pk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000710000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Volume
Device{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-trainjob-9vkfq-node-0-0-mm8ds_test-ns-slrp6(2b47a718-a9d0-4264-a3a0-4453dc2cadea): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\"/\"\"/\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_Contraction_l_Ailk_Bjlk_Cijk_Dijk_CU104_gfx90a.co: no space left on device); artifact err: provided artifact is a container image" logger="UnhandledError" Apr 23 15:37:24.300713 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:37:24.300664 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_Contraction_l_Ailk_Bjlk_Cijk_Dijk_CU104_gfx90a.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" podUID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" Apr 23 15:37:24.395745 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.395682 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="8cfae5f12a3d5e8f5711d1531d223358c13a3d4b36be844d8c6890efdfa09339" size=622989096 runtimeHandler="" Apr 23 15:37:24.455046 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.455012 2577 
image_gc_manager.go:514] "Removing image to free bytes" imageID="ec845ac5d8f1d4c74cbd447a93360fa7b8b615723fab3a377882708da6009878" size=977364430 runtimeHandler="" Apr 23 15:37:24.474485 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.474448 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="f9c8cb14b315efbe3847333b6de717d1a52318bb05b38cce743926641075fbb5" size=884076775 runtimeHandler="" Apr 23 15:37:24.516845 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.516810 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="ba0d5ab4eb24f99d84ae4923fefa85e3ab5042c1e554dcca3a41789529499633" size=107183730 runtimeHandler="" Apr 23 15:37:24.548509 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.548474 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="df7311fe93e730dc6d3d65a73c992b1583cc3d49b2e20975439f4718eb9ac4f5" size=108503547 runtimeHandler="" Apr 23 15:37:24.555661 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.555605 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="bd2f0c6a473dfa650b536cfe1992446bf45305b3ace698398143f161694113a5" size=20806872103 runtimeHandler="" Apr 23 15:37:24.607269 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.607237 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-slrp6\"/\"default-dockercfg-7vmck\"" Apr 23 15:37:24.801878 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.801847 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-slrp6\"/\"kube-root-ca.crt\"" Apr 23 15:37:24.812840 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:24.812767 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-slrp6\"/\"openshift-service-ca.crt\"" Apr 23 15:37:28.403305 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:28.402770 2577 image_gc_manager.go:514] "Removing image to free bytes" 
imageID="7e65b8288e37c3f4fac04e8bf51240765caae34795b317d44d5399762a08b761" size=23201654702 runtimeHandler="" Apr 23 15:37:28.403305 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:28.403217 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 15:37:28.403633 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:37:28.403499 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/odh-training-rocm64-torch29-py312@sha256:8a053c8ee3a4c326b745b2516a291c6b8a6e92defc5406ac2e9590bb742153f6\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: writing blob: adding layer with blob \\\"sha256:56a6a09b03e81131ca690210efc814701da032525ada07591ffe5d6c5d5a4906\\\"/\\\"\\\"/\\\"sha256:0832a7269a80e5ed5e1c5c749a8d30a6b248ce9b970d81f61a4f93a3b72673f9\\\": unpacking failed (error: exit status 1; output: write /opt/rocm-6.4.3/lib/rocblas/library/TensileLibrary_Type_HH_HPA_Fp16Alt_Contraction_l_Ailk_Bjlk_Cijk_Dijk_CU104_gfx90a.co: no space left on device); artifact err: provided artifact is a container image\"" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" podUID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" Apr 23 15:37:32.421778 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:32.421733 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="66d35fa201a713a928a7e709b4d9a882168ca54d7fe3379f8cb47cc33340fd63" size=7588072890 runtimeHandler="" Apr 23 15:37:35.796627 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:35.796571 2577 image_gc_manager.go:514] "Removing image to free bytes" imageID="819e15fdec92d846e6d5de4b1b2988adcb74f6d3046689fe03c655b03a67975d" size=18873458221 runtimeHandler="" Apr 23 15:37:38.671899 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:37:38.671871 2577 eviction_manager.go:383] "Eviction manager: able to reduce resource pressure 
without evicting pods." resourceName="ephemeral-storage" Apr 23 15:38:31.535235 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:38:31.535204 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 15:39:37.348318 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:39:37.348288 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:39:37.348893 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:39:37.348484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log" Apr 23 15:39:37.351417 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:39:37.351396 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 15:44:38.695583 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:44:38.695528 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 15:44:38.695583 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:44:38.695585 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:44:38.754008 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:44:38.695599 2577 image_gc_manager.go:230] "Failed to update image list" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:46:37.351073 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:46:37.350959 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" filter="nil" Apr 23 15:46:37.351073 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:46:37.351019 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = context 
deadline exceeded" Apr 23 15:46:37.351073 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:46:37.351035 2577 image_gc_manager.go:222] "Failed to monitor images" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Apr 23 15:46:37.353180 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:46:37.353155 2577 log.go:32] "ListImages with filter from image service failed" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" filter="nil" Apr 23 15:46:37.353256 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:46:37.353186 2577 kuberuntime_image.go:104] "Failed to list images" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 15:46:37.353256 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:46:37.353196 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="rpc error: code = DeadlineExceeded desc = stream terminated by RST_STREAM with error code: CANCEL" Apr 23 15:46:45.472351 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:46:45.472318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" event={"ID":"2b47a718-a9d0-4264-a3a0-4453dc2cadea","Type":"ContainerStarted","Data":"10622335d114d972883946fb592d24df8be929070ccee0031c02c1d20b715fc7"} Apr 23 15:46:45.474799 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:46:45.474780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-slrp6\"/\"default-dockercfg-7vmck\"" Apr 23 15:46:45.500051 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:46:45.500001 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" podStartSLOduration=1.239918506 podStartE2EDuration="21m5.499985198s" podCreationTimestamp="2026-04-23 15:25:40 +0000 UTC" firstStartedPulling="2026-04-23 15:25:41.052960511 +0000 UTC m=+1701.560042889" 
lastFinishedPulling="2026-04-23 15:46:45.313027214 +0000 UTC m=+2965.820109581" observedRunningTime="2026-04-23 15:46:45.499116495 +0000 UTC m=+2966.006198881" watchObservedRunningTime="2026-04-23 15:46:45.499985198 +0000 UTC m=+2966.007067585" Apr 23 15:46:45.591300 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:46:45.591216 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-slrp6\"/\"kube-root-ca.crt\"" Apr 23 15:46:45.601731 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:46:45.601691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-slrp6\"/\"openshift-service-ca.crt\"" Apr 23 15:47:07.538842 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:07.538805 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" containerID="10622335d114d972883946fb592d24df8be929070ccee0031c02c1d20b715fc7" exitCode=0 Apr 23 15:47:07.539275 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:07.538869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" event={"ID":"2b47a718-a9d0-4264-a3a0-4453dc2cadea","Type":"ContainerDied","Data":"10622335d114d972883946fb592d24df8be929070ccee0031c02c1d20b715fc7"} Apr 23 15:47:08.713321 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:08.713298 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:47:08.790662 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:08.790626 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4pk4\" (UniqueName: \"kubernetes.io/projected/2b47a718-a9d0-4264-a3a0-4453dc2cadea-kube-api-access-x4pk4\") pod \"2b47a718-a9d0-4264-a3a0-4453dc2cadea\" (UID: \"2b47a718-a9d0-4264-a3a0-4453dc2cadea\") " Apr 23 15:47:08.792861 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:08.792801 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b47a718-a9d0-4264-a3a0-4453dc2cadea-kube-api-access-x4pk4" (OuterVolumeSpecName: "kube-api-access-x4pk4") pod "2b47a718-a9d0-4264-a3a0-4453dc2cadea" (UID: "2b47a718-a9d0-4264-a3a0-4453dc2cadea"). InnerVolumeSpecName "kube-api-access-x4pk4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:47:08.891813 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:08.891774 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4pk4\" (UniqueName: \"kubernetes.io/projected/2b47a718-a9d0-4264-a3a0-4453dc2cadea-kube-api-access-x4pk4\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:47:09.546898 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:09.546867 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" Apr 23 15:47:09.546898 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:09.546883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds" event={"ID":"2b47a718-a9d0-4264-a3a0-4453dc2cadea","Type":"ContainerDied","Data":"3f9c31a11db588f260daedffd639e0b68d9057fd749e0a41bab37a4a3bce5244"} Apr 23 15:47:09.547103 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:09.546914 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f9c31a11db588f260daedffd639e0b68d9057fd749e0a41bab37a4a3bce5244" Apr 23 15:47:10.653931 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:10.653904 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-slrp6_test-trainjob-9vkfq-node-0-0-mm8ds_2b47a718-a9d0-4264-a3a0-4453dc2cadea/node/0.log" Apr 23 15:47:10.746539 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:47:10.746502 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9\": container with ID starting with f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9 not found: ID does not exist" containerID="f24a4e016dd2bfdc4c75084a7a66d9cd2e4ff9da28d1777c0dfc2c3e394752e9" Apr 23 15:47:10.839640 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:47:10.839604 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374\": container with ID starting with 9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374 not found: ID does not exist" containerID="9cb75c8099ece9890e7560fe5fb945ec40cb5be06b2a75500db30976de464374" Apr 23 15:47:10.940307 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:47:10.940224 2577 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea\": container with ID starting with f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea not found: ID does not exist" containerID="f67bf08debff0e2bbc95a9eb7927033bed62db75d323f4cab20f7d14154e34ea" Apr 23 15:47:11.435482 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:47:11.435447 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d\": container with ID starting with f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d not found: ID does not exist" containerID="f739b39522cd1189edb78c4768f3a53477760155ddfd86a2f16273510672571d" Apr 23 15:47:12.948639 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.948599 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5gwzz/must-gather-d9wcs"] Apr 23 15:47:12.949119 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.948902 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" containerName="node" Apr 23 15:47:12.949119 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.948912 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" containerName="node" Apr 23 15:47:12.949119 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.948974 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" containerName="node" Apr 23 15:47:12.973312 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.973285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5gwzz/must-gather-d9wcs"] Apr 23 15:47:12.973474 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.973398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:12.975714 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.975679 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5gwzz\"/\"kube-root-ca.crt\"" Apr 23 15:47:12.975839 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:12.975737 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5gwzz\"/\"openshift-service-ca.crt\"" Apr 23 15:47:13.125010 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.124977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/579bd88d-83ad-4031-b94b-05dc245ecdc1-must-gather-output\") pod \"must-gather-d9wcs\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.125176 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.125017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4x7m\" (UniqueName: \"kubernetes.io/projected/579bd88d-83ad-4031-b94b-05dc245ecdc1-kube-api-access-m4x7m\") pod \"must-gather-d9wcs\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.225563 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.225483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/579bd88d-83ad-4031-b94b-05dc245ecdc1-must-gather-output\") pod \"must-gather-d9wcs\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.225563 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.225525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4x7m\" (UniqueName: 
\"kubernetes.io/projected/579bd88d-83ad-4031-b94b-05dc245ecdc1-kube-api-access-m4x7m\") pod \"must-gather-d9wcs\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.225909 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.225887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/579bd88d-83ad-4031-b94b-05dc245ecdc1-must-gather-output\") pod \"must-gather-d9wcs\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.233660 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.233641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4x7m\" (UniqueName: \"kubernetes.io/projected/579bd88d-83ad-4031-b94b-05dc245ecdc1-kube-api-access-m4x7m\") pod \"must-gather-d9wcs\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.282745 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.282711 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:47:13.401430 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.401402 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5gwzz/must-gather-d9wcs"] Apr 23 15:47:13.403993 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:47:13.403961 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod579bd88d_83ad_4031_b94b_05dc245ecdc1.slice/crio-9cdc39083021907f63343955b6e3ea0d00e48acce68dbed6323933c0cc71919c WatchSource:0}: Error finding container 9cdc39083021907f63343955b6e3ea0d00e48acce68dbed6323933c0cc71919c: Status 404 returned error can't find the container with id 9cdc39083021907f63343955b6e3ea0d00e48acce68dbed6323933c0cc71919c Apr 23 15:47:13.405653 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.405636 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 15:47:13.559204 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:13.559120 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" event={"ID":"579bd88d-83ad-4031-b94b-05dc245ecdc1","Type":"ContainerStarted","Data":"9cdc39083021907f63343955b6e3ea0d00e48acce68dbed6323933c0cc71919c"} Apr 23 15:47:15.694370 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:15.694328 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds"] Apr 23 15:47:15.699180 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:15.699154 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-slrp6/test-trainjob-9vkfq-node-0-0-mm8ds"] Apr 23 15:47:15.795813 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:15.795776 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9"] Apr 23 15:47:15.799280 ip-10-0-141-16 kubenswrapper[2577]: I0423 
15:47:15.799255 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-4lkhz/test-trainjob-xp5x6-node-0-0-sg5c9"] Apr 23 15:47:15.994420 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:15.994337 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7"] Apr 23 15:47:15.996231 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:15.996204 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-ljvbf/test-trainjob-7d876-node-0-0-ts9r7"] Apr 23 15:47:16.094536 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.094504 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b47a718-a9d0-4264-a3a0-4453dc2cadea" path="/var/lib/kubelet/pods/2b47a718-a9d0-4264-a3a0-4453dc2cadea/volumes" Apr 23 15:47:16.094871 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.094857 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7" path="/var/lib/kubelet/pods/ac2dcd0a-5552-4f39-9a71-4f59a55e7ec7/volumes" Apr 23 15:47:16.095135 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.095124 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe949fc-4b43-40fa-a70d-954967e42af1" path="/var/lib/kubelet/pods/bbe949fc-4b43-40fa-a70d-954967e42af1/volumes" Apr 23 15:47:16.098040 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.098015 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7"] Apr 23 15:47:16.106254 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.106219 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-gmxv6/test-trainjob-ncx8l-node-0-0-kc4z7"] Apr 23 15:47:16.752107 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.752070 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"] Apr 23 15:47:16.754576 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:16.754549 2577 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["test-ns-8ccrr/test-trainjob-m8fxg-node-0-0-lp882"] Apr 23 15:47:18.095126 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:18.095085 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee4af63-5eb4-4309-bc38-0d1fd42a7702" path="/var/lib/kubelet/pods/0ee4af63-5eb4-4309-bc38-0d1fd42a7702/volumes" Apr 23 15:47:18.095536 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:18.095514 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="681ce542-0f81-4ec1-84de-2c8953d7dc00" path="/var/lib/kubelet/pods/681ce542-0f81-4ec1-84de-2c8953d7dc00/volumes" Apr 23 15:47:20.590042 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:20.589954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" event={"ID":"579bd88d-83ad-4031-b94b-05dc245ecdc1","Type":"ContainerStarted","Data":"6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981"} Apr 23 15:47:21.595195 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:21.595160 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" event={"ID":"579bd88d-83ad-4031-b94b-05dc245ecdc1","Type":"ContainerStarted","Data":"273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e"} Apr 23 15:47:21.611929 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:21.611801 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" podStartSLOduration=2.6810276330000002 podStartE2EDuration="9.611779561s" podCreationTimestamp="2026-04-23 15:47:12 +0000 UTC" firstStartedPulling="2026-04-23 15:47:13.405805367 +0000 UTC m=+2993.912887731" lastFinishedPulling="2026-04-23 15:47:20.336557292 +0000 UTC m=+3000.843639659" observedRunningTime="2026-04-23 15:47:21.611761999 +0000 UTC m=+3002.118844380" watchObservedRunningTime="2026-04-23 15:47:21.611779561 +0000 UTC m=+3002.118861948" Apr 23 15:47:31.104987 
ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:31.104947 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-w24ld_fc613cfd-3e25-4570-a930-ebb12df77e8a/manager/0.log" Apr 23 15:47:31.632959 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:31.632922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-w24ld_fc613cfd-3e25-4570-a930-ebb12df77e8a/manager/0.log" Apr 23 15:47:32.594609 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:47:32.594574 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-w24ld_fc613cfd-3e25-4570-a930-ebb12df77e8a/manager/0.log" Apr 23 15:48:12.785949 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:12.785909 2577 generic.go:358] "Generic (PLEG): container finished" podID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerID="6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981" exitCode=0 Apr 23 15:48:12.786385 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:12.785978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" event={"ID":"579bd88d-83ad-4031-b94b-05dc245ecdc1","Type":"ContainerDied","Data":"6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981"} Apr 23 15:48:12.786385 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:12.786256 2577 scope.go:117] "RemoveContainer" containerID="6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981" Apr 23 15:48:13.335766 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.335730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gwzz_must-gather-d9wcs_579bd88d-83ad-4031-b94b-05dc245ecdc1/gather/0.log" Apr 23 15:48:13.872270 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.872229 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-75czz/must-gather-8clf2"] 
Apr 23 15:48:13.875902 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.875868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:13.878735 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.878713 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-75czz\"/\"openshift-service-ca.crt\"" Apr 23 15:48:13.878858 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.878713 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-75czz\"/\"default-dockercfg-6s449\"" Apr 23 15:48:13.878858 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.878713 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-75czz\"/\"kube-root-ca.crt\"" Apr 23 15:48:13.883342 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.883316 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75czz/must-gather-8clf2"] Apr 23 15:48:13.955032 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.954993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj84d\" (UniqueName: \"kubernetes.io/projected/e479146c-7450-4d45-ae26-efe5062f02c7-kube-api-access-mj84d\") pod \"must-gather-8clf2\" (UID: \"e479146c-7450-4d45-ae26-efe5062f02c7\") " pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:13.955200 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:13.955052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e479146c-7450-4d45-ae26-efe5062f02c7-must-gather-output\") pod \"must-gather-8clf2\" (UID: \"e479146c-7450-4d45-ae26-efe5062f02c7\") " pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:14.056378 ip-10-0-141-16 kubenswrapper[2577]: I0423 
15:48:14.056339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj84d\" (UniqueName: \"kubernetes.io/projected/e479146c-7450-4d45-ae26-efe5062f02c7-kube-api-access-mj84d\") pod \"must-gather-8clf2\" (UID: \"e479146c-7450-4d45-ae26-efe5062f02c7\") " pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:14.056544 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:14.056395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e479146c-7450-4d45-ae26-efe5062f02c7-must-gather-output\") pod \"must-gather-8clf2\" (UID: \"e479146c-7450-4d45-ae26-efe5062f02c7\") " pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:14.056769 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:14.056752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e479146c-7450-4d45-ae26-efe5062f02c7-must-gather-output\") pod \"must-gather-8clf2\" (UID: \"e479146c-7450-4d45-ae26-efe5062f02c7\") " pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:14.064962 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:14.064934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj84d\" (UniqueName: \"kubernetes.io/projected/e479146c-7450-4d45-ae26-efe5062f02c7-kube-api-access-mj84d\") pod \"must-gather-8clf2\" (UID: \"e479146c-7450-4d45-ae26-efe5062f02c7\") " pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:14.185917 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:14.185887 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-75czz/must-gather-8clf2" Apr 23 15:48:14.309625 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:14.309541 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75czz/must-gather-8clf2"] Apr 23 15:48:14.311806 ip-10-0-141-16 kubenswrapper[2577]: W0423 15:48:14.311778 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode479146c_7450_4d45_ae26_efe5062f02c7.slice/crio-98f308932b1b44200c5e6687da6f49334e4f7928610530fe75358b120d52c095 WatchSource:0}: Error finding container 98f308932b1b44200c5e6687da6f49334e4f7928610530fe75358b120d52c095: Status 404 returned error can't find the container with id 98f308932b1b44200c5e6687da6f49334e4f7928610530fe75358b120d52c095 Apr 23 15:48:14.792807 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:14.792768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/must-gather-8clf2" event={"ID":"e479146c-7450-4d45-ae26-efe5062f02c7","Type":"ContainerStarted","Data":"98f308932b1b44200c5e6687da6f49334e4f7928610530fe75358b120d52c095"} Apr 23 15:48:15.798940 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:15.798898 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/must-gather-8clf2" event={"ID":"e479146c-7450-4d45-ae26-efe5062f02c7","Type":"ContainerStarted","Data":"fc3d7915ab71b9754349a04a39e79b8991052301b2160c8f2355d409695b8146"} Apr 23 15:48:15.799348 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:15.798947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/must-gather-8clf2" event={"ID":"e479146c-7450-4d45-ae26-efe5062f02c7","Type":"ContainerStarted","Data":"78d23c0013aed3af28bb312ef717d67dbbaca930da714d7daebf8c5033844859"} Apr 23 15:48:15.814833 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:15.814772 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-75czz/must-gather-8clf2" podStartSLOduration=2.044502011 podStartE2EDuration="2.814751179s" podCreationTimestamp="2026-04-23 15:48:13 +0000 UTC" firstStartedPulling="2026-04-23 15:48:14.313580068 +0000 UTC m=+3054.820662437" lastFinishedPulling="2026-04-23 15:48:15.083829239 +0000 UTC m=+3055.590911605" observedRunningTime="2026-04-23 15:48:15.813368774 +0000 UTC m=+3056.320451177" watchObservedRunningTime="2026-04-23 15:48:15.814751179 +0000 UTC m=+3056.321833568" Apr 23 15:48:16.665595 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:16.665559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fpdpm_9bcef650-4335-4e0e-bbf1-0e9794d90e8c/global-pull-secret-syncer/0.log" Apr 23 15:48:16.767040 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:16.767006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6ttdv_2c98eab9-7fb6-4067-92ac-e85bf7a1af4b/konnectivity-agent/0.log" Apr 23 15:48:16.897759 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:16.897725 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-16.ec2.internal_237efac7542ae805317afa8331e5e27b/haproxy/0.log" Apr 23 15:48:18.727030 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:18.726986 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5gwzz/must-gather-d9wcs"] Apr 23 15:48:18.727638 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:18.727267 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="copy" containerID="cri-o://273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e" gracePeriod=2 Apr 23 15:48:18.734956 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:18.734915 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-5gwzz/must-gather-d9wcs"] Apr 23 15:48:19.108718 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.108482 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gwzz_must-gather-d9wcs_579bd88d-83ad-4031-b94b-05dc245ecdc1/copy/0.log" Apr 23 15:48:19.108932 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.108900 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:48:19.115156 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.114301 2577 status_manager.go:895] "Failed to get status for pod" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" err="pods \"must-gather-d9wcs\" is forbidden: User \"system:node:ip-10-0-141-16.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5gwzz\": no relationship found between node 'ip-10-0-141-16.ec2.internal' and this object" Apr 23 15:48:19.214965 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.214925 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/579bd88d-83ad-4031-b94b-05dc245ecdc1-must-gather-output\") pod \"579bd88d-83ad-4031-b94b-05dc245ecdc1\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " Apr 23 15:48:19.215146 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.214981 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4x7m\" (UniqueName: \"kubernetes.io/projected/579bd88d-83ad-4031-b94b-05dc245ecdc1-kube-api-access-m4x7m\") pod \"579bd88d-83ad-4031-b94b-05dc245ecdc1\" (UID: \"579bd88d-83ad-4031-b94b-05dc245ecdc1\") " Apr 23 15:48:19.217429 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.217393 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/579bd88d-83ad-4031-b94b-05dc245ecdc1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "579bd88d-83ad-4031-b94b-05dc245ecdc1" (UID: "579bd88d-83ad-4031-b94b-05dc245ecdc1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 15:48:19.221213 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.221177 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579bd88d-83ad-4031-b94b-05dc245ecdc1-kube-api-access-m4x7m" (OuterVolumeSpecName: "kube-api-access-m4x7m") pod "579bd88d-83ad-4031-b94b-05dc245ecdc1" (UID: "579bd88d-83ad-4031-b94b-05dc245ecdc1"). InnerVolumeSpecName "kube-api-access-m4x7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 15:48:19.316933 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.316883 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/579bd88d-83ad-4031-b94b-05dc245ecdc1-must-gather-output\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:48:19.316933 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.316936 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m4x7m\" (UniqueName: \"kubernetes.io/projected/579bd88d-83ad-4031-b94b-05dc245ecdc1-kube-api-access-m4x7m\") on node \"ip-10-0-141-16.ec2.internal\" DevicePath \"\"" Apr 23 15:48:19.818144 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.818108 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5gwzz_must-gather-d9wcs_579bd88d-83ad-4031-b94b-05dc245ecdc1/copy/0.log" Apr 23 15:48:19.818751 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.818505 2577 generic.go:358] "Generic (PLEG): container finished" podID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerID="273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e" exitCode=143 Apr 23 15:48:19.818751 ip-10-0-141-16 
kubenswrapper[2577]: I0423 15:48:19.818633 2577 scope.go:117] "RemoveContainer" containerID="273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e" Apr 23 15:48:19.818865 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.818797 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" Apr 23 15:48:19.828007 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.827965 2577 status_manager.go:895] "Failed to get status for pod" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" err="pods \"must-gather-d9wcs\" is forbidden: User \"system:node:ip-10-0-141-16.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5gwzz\": no relationship found between node 'ip-10-0-141-16.ec2.internal' and this object" Apr 23 15:48:19.834103 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.834062 2577 status_manager.go:895] "Failed to get status for pod" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" err="pods \"must-gather-d9wcs\" is forbidden: User \"system:node:ip-10-0-141-16.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5gwzz\": no relationship found between node 'ip-10-0-141-16.ec2.internal' and this object" Apr 23 15:48:19.842511 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.840569 2577 scope.go:117] "RemoveContainer" containerID="6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981" Apr 23 15:48:19.861595 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.861556 2577 scope.go:117] "RemoveContainer" containerID="273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e" Apr 23 15:48:19.862151 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:48:19.862115 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e\": container with ID starting with 273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e not found: ID does not exist" containerID="273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e" Apr 23 15:48:19.862276 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.862162 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e"} err="failed to get container status \"273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e\": rpc error: code = NotFound desc = could not find container \"273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e\": container with ID starting with 273ec56d351fe4a36b9f3d5efe4908c492d96333755729c3fee60ab98931640e not found: ID does not exist" Apr 23 15:48:19.862276 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.862191 2577 scope.go:117] "RemoveContainer" containerID="6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981" Apr 23 15:48:19.862583 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:48:19.862554 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981\": container with ID starting with 6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981 not found: ID does not exist" containerID="6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981" Apr 23 15:48:19.862662 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:19.862591 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981"} err="failed to get container status \"6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981\": rpc error: code = NotFound desc = could not find container 
\"6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981\": container with ID starting with 6b27e36e142694f662ded65f540dbb61543cdf1cf8c016e1d7b17b9ffc580981 not found: ID does not exist"
Apr 23 15:48:20.023976 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.023944 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/alertmanager/0.log"
Apr 23 15:48:20.051092 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.051059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/config-reloader/0.log"
Apr 23 15:48:20.075960 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.075873 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/kube-rbac-proxy-web/0.log"
Apr 23 15:48:20.095540 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.095501 2577 status_manager.go:895] "Failed to get status for pod" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" pod="openshift-must-gather-5gwzz/must-gather-d9wcs" err="pods \"must-gather-d9wcs\" is forbidden: User \"system:node:ip-10-0-141-16.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5gwzz\": no relationship found between node 'ip-10-0-141-16.ec2.internal' and this object"
Apr 23 15:48:20.099722 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.096582 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" path="/var/lib/kubelet/pods/579bd88d-83ad-4031-b94b-05dc245ecdc1/volumes"
Apr 23 15:48:20.119020 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.118985 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/kube-rbac-proxy/0.log"
Apr 23 15:48:20.146583 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.146546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/kube-rbac-proxy-metric/0.log"
Apr 23 15:48:20.172748 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.172714 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/prom-label-proxy/0.log"
Apr 23 15:48:20.205712 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.205661 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a048f109-903d-48f3-8b93-bf6c8b811e53/init-config-reloader/0.log"
Apr 23 15:48:20.367948 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.367869 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5c59f89644-9dzhs_97d7f10c-9cd4-4ae9-9636-cb7586b6f490/metrics-server/0.log"
Apr 23 15:48:20.396080 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.396049 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-87wcw_3fb6691a-6e97-45ad-a1d2-d761e8d9f27f/monitoring-plugin/0.log"
Apr 23 15:48:20.509418 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.509383 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t7hbb_94819bd0-3e06-4ceb-94d5-520933061da5/node-exporter/0.log"
Apr 23 15:48:20.528624 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.528593 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t7hbb_94819bd0-3e06-4ceb-94d5-520933061da5/kube-rbac-proxy/0.log"
Apr 23 15:48:20.550539 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.550510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-t7hbb_94819bd0-3e06-4ceb-94d5-520933061da5/init-textfile/0.log"
Apr 23 15:48:20.656063 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.656031 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kh5pf_43c1c226-867a-4730-91bb-e0380cebaf6e/kube-rbac-proxy-main/0.log"
Apr 23 15:48:20.680133 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.680099 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kh5pf_43c1c226-867a-4730-91bb-e0380cebaf6e/kube-rbac-proxy-self/0.log"
Apr 23 15:48:20.707068 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:20.707038 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kh5pf_43c1c226-867a-4730-91bb-e0380cebaf6e/openshift-state-metrics/0.log"
Apr 23 15:48:21.017350 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:21.017269 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-855f55c85c-km4hp_811c13a0-c536-4b9f-85e7-27c020082c98/telemeter-client/0.log"
Apr 23 15:48:21.050860 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:21.050827 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-855f55c85c-km4hp_811c13a0-c536-4b9f-85e7-27c020082c98/reload/0.log"
Apr 23 15:48:21.081496 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:21.081466 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-855f55c85c-km4hp_811c13a0-c536-4b9f-85e7-27c020082c98/kube-rbac-proxy/0.log"
Apr 23 15:48:23.570053 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.569998 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"]
Apr 23 15:48:23.571085 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.571057 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="gather"
Apr 23 15:48:23.571225 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.571213 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="gather"
Apr 23 15:48:23.571333 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.571323 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="copy"
Apr 23 15:48:23.571406 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.571395 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="copy"
Apr 23 15:48:23.571581 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.571567 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="copy"
Apr 23 15:48:23.571694 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.571674 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="579bd88d-83ad-4031-b94b-05dc245ecdc1" containerName="gather"
Apr 23 15:48:23.576303 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.576283 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.581508 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.581482 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"]
Apr 23 15:48:23.656829 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.656798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-sys\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.656829 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.656834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-podres\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.657079 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.656864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-proc\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.657079 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.656967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5v9g\" (UniqueName: \"kubernetes.io/projected/68b79f08-3502-47c1-8c9e-20e75f6539f3-kube-api-access-f5v9g\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.657079 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.657035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-lib-modules\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758339 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-sys\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758339 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-podres\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-proc\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5v9g\" (UniqueName: \"kubernetes.io/projected/68b79f08-3502-47c1-8c9e-20e75f6539f3-kube-api-access-f5v9g\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-sys\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-lib-modules\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-podres\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-proc\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.758593 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.758522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68b79f08-3502-47c1-8c9e-20e75f6539f3-lib-modules\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.767233 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.767213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5v9g\" (UniqueName: \"kubernetes.io/projected/68b79f08-3502-47c1-8c9e-20e75f6539f3-kube-api-access-f5v9g\") pod \"perf-node-gather-daemonset-gqmcj\" (UID: \"68b79f08-3502-47c1-8c9e-20e75f6539f3\") " pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:23.890088 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:23.888432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:24.031745 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.026573 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"]
Apr 23 15:48:24.334176 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.334144 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d9zxh_b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea/dns/0.log"
Apr 23 15:48:24.356516 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.356483 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d9zxh_b24ca5ef-2f02-4ddb-9e88-c2eae3af67ea/kube-rbac-proxy/0.log"
Apr 23 15:48:24.468299 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.468222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2hv6m_1399ad28-3a6d-4e8a-9154-bb0eafc7e101/dns-node-resolver/0.log"
Apr 23 15:48:24.839222 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.839124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj" event={"ID":"68b79f08-3502-47c1-8c9e-20e75f6539f3","Type":"ContainerStarted","Data":"eac0ff0249a71f28a5eaf6300bcc29345c0723bfbe0b4d465f2eacd63fd5bac4"}
Apr 23 15:48:24.839222 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.839179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj" event={"ID":"68b79f08-3502-47c1-8c9e-20e75f6539f3","Type":"ContainerStarted","Data":"c900d2d0c1cf3aae99f0f97b706af25b11c7983466e94137a7ae98b49c48b649"}
Apr 23 15:48:24.840022 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.839991 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:24.857582 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.857537 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj" podStartSLOduration=1.857521546 podStartE2EDuration="1.857521546s" podCreationTimestamp="2026-04-23 15:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 15:48:24.856649315 +0000 UTC m=+3065.363731701" watchObservedRunningTime="2026-04-23 15:48:24.857521546 +0000 UTC m=+3065.364603932"
Apr 23 15:48:24.955222 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:24.955189 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-744fbcdbfd-v2rps_432ca574-ef9a-4480-b470-c447442aa39a/registry/0.log"
Apr 23 15:48:25.038758 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:25.038719 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tg5n4_b32c0a28-43de-408e-beab-64a0a9ff6ac3/node-ca/0.log"
Apr 23 15:48:26.182143 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:26.182108 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jvjqc_938f7023-4fea-4435-b684-b2a0a3193583/serve-healthcheck-canary/0.log"
Apr 23 15:48:26.543577 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:26.543493 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5fx2c_329f6021-c417-448e-80e7-c55ea27c6445/kube-rbac-proxy/0.log"
Apr 23 15:48:26.566736 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:26.566688 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5fx2c_329f6021-c417-448e-80e7-c55ea27c6445/exporter/0.log"
Apr 23 15:48:26.589879 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:26.589849 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5fx2c_329f6021-c417-448e-80e7-c55ea27c6445/extractor/0.log"
Apr 23 15:48:28.382169 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:28.382135 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-4hk2f_a8ac0d8b-3d01-43af-a664-507e367c0b0e/jobset-operator/0.log"
Apr 23 15:48:31.859521 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:31.859482 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-75czz/perf-node-gather-daemonset-gqmcj"
Apr 23 15:48:32.858264 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:32.858235 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/kube-multus-additional-cni-plugins/0.log"
Apr 23 15:48:32.885367 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:32.885334 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/egress-router-binary-copy/0.log"
Apr 23 15:48:32.909693 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:32.909659 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/cni-plugins/0.log"
Apr 23 15:48:32.933739 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:32.933690 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/bond-cni-plugin/0.log"
Apr 23 15:48:32.956837 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:32.956804 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/routeoverride-cni/0.log"
Apr 23 15:48:32.984048 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:32.984008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/whereabouts-cni-bincopy/0.log"
Apr 23 15:48:33.032415 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:33.032386 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jhxf7_82866f79-d634-4792-b3fd-ef2753feb90f/whereabouts-cni/0.log"
Apr 23 15:48:33.257484 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:33.257458 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w69vk_018b5460-f98e-40c3-a50c-c43ca05fa0ef/kube-multus/0.log"
Apr 23 15:48:33.282407 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:33.282380 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bjpzv_be6b2313-a857-46d8-8b0d-adbd4a48cb9d/network-metrics-daemon/0.log"
Apr 23 15:48:33.304153 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:33.304117 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bjpzv_be6b2313-a857-46d8-8b0d-adbd4a48cb9d/kube-rbac-proxy/0.log"
Apr 23 15:48:34.612760 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.612662 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-controller/0.log"
Apr 23 15:48:34.630375 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.630329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/0.log"
Apr 23 15:48:34.652390 ip-10-0-141-16 kubenswrapper[2577]: E0423 15:48:34.652356 2577 log.go:32] "Failed when parsing line in log file" err="unexpected stream type \"stder2026-04-23T15:37:47.016482205+00:00\"" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/1.log" line="2026-04-23T15:37:17.013035414+00:00 stder2026-04-23T15:37:47.016482205+00:00 stderr F + true\n"
Apr 23 15:48:34.658395 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.658358 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovn-acl-logging/1.log"
Apr 23 15:48:34.680193 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.680148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/kube-rbac-proxy-node/0.log"
Apr 23 15:48:34.701672 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.701648 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 15:48:34.721765 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.721740 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/northd/0.log"
Apr 23 15:48:34.744110 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.744085 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/nbdb/0.log"
Apr 23 15:48:34.768998 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.768970 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/sbdb/0.log"
Apr 23 15:48:34.929734 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:34.929652 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpxb2_43741ca5-8515-4e77-be0b-52d3495c2460/ovnkube-controller/0.log"
Apr 23 15:48:36.249753 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:36.249721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6rls8_3236e428-70b1-4400-9f33-348489a945df/network-check-target-container/0.log"
Apr 23 15:48:37.284165 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:37.284131 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-k777l_aa2c92c0-87e9-40a1-8ef8-7cb7d5e860d4/iptables-alerter/0.log"
Apr 23 15:48:38.063140 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:38.063105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nhrx7_e4d459d1-0077-45ce-b190-9cc485dd9ae5/tuned/0.log"
Apr 23 15:48:41.668086 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:41.668052 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-kq7jc_591b1876-3766-4f35-9533-34a2ea6f684e/csi-driver/0.log"
Apr 23 15:48:41.691097 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:41.691064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-kq7jc_591b1876-3766-4f35-9533-34a2ea6f684e/csi-node-driver-registrar/0.log"
Apr 23 15:48:41.711848 ip-10-0-141-16 kubenswrapper[2577]: I0423 15:48:41.711806 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-kq7jc_591b1876-3766-4f35-9533-34a2ea6f684e/csi-liveness-probe/0.log"