Apr 23 08:50:54.924167 ip-10-0-141-250 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 08:50:55.303935 ip-10-0-141-250 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:55.303935 ip-10-0-141-250 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 08:50:55.303935 ip-10-0-141-250 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 08:50:55.303935 ip-10-0-141-250 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 08:50:55.303935 ip-10-0-141-250 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
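The deprecation warnings above say these flags should instead be set in the file passed via --config (on this node, /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent KubeletConfiguration fields — the values shown are illustrative assumptions, not read from this node:

```yaml
# Sketch of a kubelet config file replacing the deprecated CLI flags.
# All values below are placeholders, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:              # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                # eviction thresholds replace --minimum-container-ttl-duration
  memory.available: 100Mi
```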
Apr 23 08:50:55.305859 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.305768 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 08:50:55.311232 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311209 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:55.311232 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311227 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:55.311232 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311231 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:55.311232 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311234 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:55.311232 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311237 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:55.311232 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311240 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311244 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311247 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311250 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311252 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311255 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311258 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311260 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311263 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311265 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311268 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311270 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311273 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311276 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311278 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311281 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311283 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311286 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311292 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:55.311450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311295 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311298 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311300 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311303 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311306 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311308 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311311 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311314 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311318 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311323 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311326 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311328 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311332 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311335 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311337 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311340 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311344 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311346 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311349 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311352 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:55.311917 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311354 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311359 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311361 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311364 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311366 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311369 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311371 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311374 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311376 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311379 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311381 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311385 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311387 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311390 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311393 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311395 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311398 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311400 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311403 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:55.312405 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311406 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311408 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311411 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311413 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311415 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311420 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311423 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311426 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311429 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311432 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311434 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311437 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311440 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311443 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311447 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311457 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311461 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311463 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311466 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311469 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:55.312850 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311471 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311474 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311476 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311860 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311865 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311868 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311871 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311873 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311876 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311878 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311881 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311883 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311886 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311888 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311891 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311907 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311910 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311913 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311915 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311919 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:55.313326 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311923 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311926 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311929 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311932 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311935 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311938 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311941 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311944 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311946 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311949 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311951 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311957 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311960 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311962 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311965 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311967 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311970 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311972 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311975 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311977 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:55.313794 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311979 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311982 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311984 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311987 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311989 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311993 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311996 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.311999 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312001 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312004 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312006 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312009 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312012 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312015 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312017 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312020 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312022 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312025 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312028 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:55.314389 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312031 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312034 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312037 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312039 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312042 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312044 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312047 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312049 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312051 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312054 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312058 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312061 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312065 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312068 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312070 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312073 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312076 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312079 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312081 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:55.314882 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312084 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312087 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312089 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312094 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312096 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312099 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312102 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312104 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312107 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312110 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312112 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312192 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312199 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312208 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312212 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312217 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312220 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312225 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312229 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312233 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312236 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 08:50:55.315366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312239 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312242 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312245 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312248 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312251 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312254 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312257 2575 flags.go:64] FLAG: --cloud-config=""
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312259 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312262 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312267 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312269 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312272 2575 flags.go:64] FLAG: --config-dir=""
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312275 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312278 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312282 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312285 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312288 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312291 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312294 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312297 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312300 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312303 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312306 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312310 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312313 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 08:50:55.315871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312316 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312319 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312322 2575 flags.go:64] FLAG: --enable-server="true"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312325 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312329 2575 flags.go:64] FLAG: --event-burst="100"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312333 2575 flags.go:64] FLAG: --event-qps="50"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312336 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312339 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312341 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312345 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312348 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312351 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312354 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312357 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312360 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312362 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312365 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312368 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312371 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312373 2575 flags.go:64] FLAG: --feature-gates=""
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312377 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312380 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312383 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312386 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312389 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 08:50:55.316504 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312392 2575 flags.go:64] FLAG: --help="false"
Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312395 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-141-250.ec2.internal"
Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312398 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312401 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312404 2575 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312407 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312411 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312414 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312417 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312419 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312423 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312426 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312430 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312433 2575 flags.go:64] FLAG: --kube-reserved="" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312436 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312439 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312442 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312445 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 08:50:55.317108 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:50:55.312448 2575 flags.go:64] FLAG: --lock-file="" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312450 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312453 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312456 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312461 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 08:50:55.317108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312464 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312467 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312469 2575 flags.go:64] FLAG: --logging-format="text" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312472 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312475 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312478 2575 flags.go:64] FLAG: --manifest-url="" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312481 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312485 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312488 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312492 2575 flags.go:64] FLAG: --max-pods="110" Apr 23 
08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312495 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312498 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312501 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312504 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312507 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312510 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312512 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312520 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312522 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312525 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312532 2575 flags.go:64] FLAG: --pod-cidr="" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312537 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312542 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:50:55.312545 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 08:50:55.317682 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312548 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312551 2575 flags.go:64] FLAG: --port="10250" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312554 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312557 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06ac94e85f337c426" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312560 2575 flags.go:64] FLAG: --qos-reserved="" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312562 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312565 2575 flags.go:64] FLAG: --register-node="true" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312568 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312571 2575 flags.go:64] FLAG: --register-with-taints="" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312574 2575 flags.go:64] FLAG: --registry-burst="10" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312577 2575 flags.go:64] FLAG: --registry-qps="5" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312580 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312582 2575 flags.go:64] FLAG: --reserved-memory="" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312586 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312589 2575 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312592 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312594 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312597 2575 flags.go:64] FLAG: --runonce="false" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312600 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312604 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312606 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312609 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312612 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312615 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312618 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312621 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 08:50:55.318275 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312623 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312626 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312629 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 
08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312633 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312636 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312639 2575 flags.go:64] FLAG: --system-cgroups="" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312642 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312648 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312651 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312654 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312658 2575 flags.go:64] FLAG: --tls-min-version="" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312661 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312664 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312667 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312670 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312673 2575 flags.go:64] FLAG: --v="2" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312677 2575 flags.go:64] FLAG: --version="false" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312681 2575 flags.go:64] FLAG: --vmodule="" 
Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312685 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.312688 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312770 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312777 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312780 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312783 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 08:50:55.319093 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312786 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312788 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312791 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312793 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312796 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312798 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312801 
2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312803 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312806 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312808 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312811 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312814 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312818 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312820 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312823 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312825 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312828 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312830 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312833 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints 
Apr 23 08:50:55.320098 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312835 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312838 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312840 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312843 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312845 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312848 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312850 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312853 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312856 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312858 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312861 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312864 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312866 2575 feature_gate.go:328] unrecognized feature 
gate: MultiDiskSetup Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312869 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312872 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312874 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312877 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312879 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312882 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312884 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 08:50:55.320890 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312886 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312889 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312891 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312906 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312910 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 
08:50:55.312913 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312916 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312918 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312921 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312923 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312926 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312928 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312931 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312933 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312936 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312938 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312941 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312943 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 08:50:55.321801 ip-10-0-141-250 
kubenswrapper[2575]: W0423 08:50:55.312946 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312948 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 08:50:55.321801 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312951 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312953 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312956 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312959 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312961 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312964 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312966 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312969 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312973 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312977 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312980 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312983 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312986 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312988 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312991 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312994 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.312998 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.313000 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.313003 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 08:50:55.322368 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.313006 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.313008 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 
08:50:55.313011 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.313014 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.313634 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.320738 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.320761 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320831 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320839 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320844 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320849 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320855 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320859 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320864 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320868 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320872 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:55.322971 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320877 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320882 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320886 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320890 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320911 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320915 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320920 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320924 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320928 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320931 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320935 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320939 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320944 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320948 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320952 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320957 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320962 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320967 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320971 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320975 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:55.323481 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320979 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320983 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320988 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320993 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.320997 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321001 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321005 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321009 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321013 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321017 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321022 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321026 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321030 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321034 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321038 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321042 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321046 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321050 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321054 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321058 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:55.324293 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321062 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321066 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321071 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321079 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321084 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321089 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321094 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321098 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321102 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321109 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321115 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321120 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321125 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321130 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321135 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321140 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321145 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321149 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321153 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:55.324952 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321157 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321161 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321166 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321170 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321174 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321178 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321182 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321187 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321191 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321195 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321199 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321203 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321208 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321212 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321216 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321220 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321224 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:55.325504 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321228 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.321236 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321388 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321395 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321400 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321405 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321411 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321415 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321420 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321424 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321429 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321433 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321440 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321470 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321477 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 08:50:55.325975 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321482 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321487 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321491 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321496 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321501 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321506 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321510 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321514 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321518 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321523 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321527 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321531 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321535 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321539 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321543 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321547 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321551 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321555 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321559 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 08:50:55.326372 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321563 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321567 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321571 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321575 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321579 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321584 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321589 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321593 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321597 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321601 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321605 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321609 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321613 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321617 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321621 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321625 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321629 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321633 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321637 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321641 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 08:50:55.326875 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321646 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321650 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321654 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321658 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321662 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321666 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321670 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321673 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321677 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321682 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321686 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321690 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321694 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321699 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321703 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321707 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321711 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321715 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321720 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 08:50:55.327419 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321724 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321729 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321733 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321737 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321741 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321745 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321749 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321753 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321757 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321761 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321765 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321769 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321774 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321780 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:55.321785 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.321793 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 08:50:55.327918 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.322572 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 08:50:55.328340 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.325326 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 08:50:55.328340 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.326222 2575 server.go:1019] "Starting client certificate rotation"
Apr 23 08:50:55.328340 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.326320 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:50:55.328340 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.326361 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 08:50:55.346992 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.346964 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:50:55.349414 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.349391 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 08:50:55.366598 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.366575 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 23 08:50:55.372476 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.372455 2575 log.go:25] "Validated CRI v1 image API"
Apr 23 08:50:55.373815 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.373796 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 08:50:55.373893 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.373835 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:50:55.377658 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.377634 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c46ee820-3b14-4605-8f47-d45e88b0c329:/dev/nvme0n1p4 f454c040-162e-4891-a6e4-92b3898eaabb:/dev/nvme0n1p3]
Apr 23 08:50:55.377740 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.377656 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 08:50:55.383105 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.382990 2575 manager.go:217] Machine: {Timestamp:2026-04-23 08:50:55.381297188 +0000 UTC m=+0.356604805 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3267369 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20fd8ebc633efb19c1575ca221fc7c SystemUUID:ec20fd8e-bc63-3efb-19c1-575ca221fc7c BootID:dd291409-c45b-47ca-8069-f2e805d7aab3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:85:51:7a:2f:3f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:85:51:7a:2f:3f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d2:b6:1b:ad:96:76 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 08:50:55.383105 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.383094 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 08:50:55.383243 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.383203 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 08:50:55.384207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.384179 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 08:50:55.384370 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.384208 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-250.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 08:50:55.384455 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.384383 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 08:50:55.384455 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.384395 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 08:50:55.384455 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.384412 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:50:55.385743 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.385730 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 08:50:55.387401 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.387388 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 08:50:55.387688 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.387675 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 08:50:55.389879 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.389867 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 08:50:55.389956 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.389890 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 08:50:55.389956 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.389924 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 08:50:55.389956 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.389954 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 23 08:50:55.390078 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.389977 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 08:50:55.390991 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.390979 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:50:55.391063 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.391000 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 08:50:55.393502 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.393486 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 08:50:55.395068 ip-10-0-141-250
kubenswrapper[2575]: I0423 08:50:55.395046 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 08:50:55.395506 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.395489 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d69bk" Apr 23 08:50:55.396607 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396592 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396615 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396624 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396635 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396644 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396653 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396662 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396671 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 08:50:55.396681 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396681 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 08:50:55.396936 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396690 2575 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 08:50:55.396936 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396703 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 08:50:55.396936 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.396716 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 08:50:55.398195 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.398184 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 08:50:55.398246 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.398199 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 08:50:55.401756 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.401742 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 08:50:55.401840 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.401784 2575 server.go:1295] "Started kubelet" Apr 23 08:50:55.401912 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.401859 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 08:50:55.401955 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.401877 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 08:50:55.402007 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.401960 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 08:50:55.402523 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.402500 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-250.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 08:50:55.402613 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.402537 2575 csi_plugin.go:988] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-250.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 08:50:55.402680 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.402645 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 08:50:55.402670 ip-10-0-141-250 systemd[1]: Started Kubernetes Kubelet. Apr 23 08:50:55.403342 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.403324 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 08:50:55.403662 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.403478 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d69bk" Apr 23 08:50:55.404109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.404095 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 23 08:50:55.411582 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.411540 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 08:50:55.414479 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.414460 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 08:50:55.414955 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.414939 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 08:50:55.415469 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.415451 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 08:50:55.415469 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.415472 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 08:50:55.415606 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.415504 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 08:50:55.415606 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.415572 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 23 08:50:55.415606 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.415583 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 23 08:50:55.415939 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.415918 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 08:50:55.416084 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.416043 2575 factory.go:55] Registering systemd factory Apr 23 08:50:55.416220 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.416208 2575 factory.go:223] Registration of the systemd container factory successfully Apr 23 08:50:55.416932 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.416888 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:55.417589 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.417570 2575 
factory.go:153] Registering CRI-O factory Apr 23 08:50:55.417589 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.417589 2575 factory.go:223] Registration of the crio container factory successfully Apr 23 08:50:55.417714 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.417640 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 08:50:55.417714 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.417675 2575 factory.go:103] Registering Raw factory Apr 23 08:50:55.417714 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.417689 2575 manager.go:1196] Started watching for new ooms in manager Apr 23 08:50:55.418142 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.418128 2575 manager.go:319] Starting recovery of all containers Apr 23 08:50:55.419054 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.419027 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-250.ec2.internal\" not found" node="ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.428280 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.428114 2575 manager.go:324] Recovery completed Apr 23 08:50:55.432164 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.432151 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:55.434690 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.434674 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:55.434760 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.434703 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:55.434760 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:50:55.434715 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:55.435185 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.435169 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 08:50:55.435270 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.435184 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 08:50:55.435270 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.435203 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 23 08:50:55.438013 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.438001 2575 policy_none.go:49] "None policy: Start" Apr 23 08:50:55.438057 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.438019 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 08:50:55.438057 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.438028 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 23 08:50:55.472458 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472445 2575 manager.go:341] "Starting Device Plugin manager" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.472472 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472482 2575 server.go:85] "Starting device plugin registration server" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472729 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472748 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472850 2575 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472967 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.472976 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.473483 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 08:50:55.479656 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.473515 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 08:50:55.537998 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.537971 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 08:50:55.539180 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.539163 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 08:50:55.539296 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.539194 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 08:50:55.539296 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.539215 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 08:50:55.539296 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.539224 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 08:50:55.539296 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.539262 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 08:50:55.541927 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.541890 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 08:50:55.573546 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.573496 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:55.574911 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.574876 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:55.574986 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.574925 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:55.574986 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.574937 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:55.574986 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.574961 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.583825 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.583410 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.583825 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.583457 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-250.ec2.internal\": node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 
08:50:55.606191 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.606171 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 08:50:55.639655 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.639607 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal"] Apr 23 08:50:55.639747 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.639711 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:55.640550 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.640537 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:55.640627 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.640568 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:55.640627 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.640582 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:55.641795 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.641780 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:55.641961 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.641945 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.642015 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.641974 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:55.642471 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.642453 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:55.642557 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.642482 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:55.642557 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.642492 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:55.642557 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.642453 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:55.642557 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.642544 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:55.642672 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.642563 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:55.643521 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.643506 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.643597 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.643533 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 08:50:55.644211 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.644193 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientMemory" Apr 23 08:50:55.644284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.644226 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 08:50:55.644284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.644241 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeHasSufficientPID" Apr 23 08:50:55.674499 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.674479 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-250.ec2.internal\" not found" node="ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.677805 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.677790 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-250.ec2.internal\" not found" node="ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.706508 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.706488 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 08:50:55.717206 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.717184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4a9276482c5142a05a682b295ad6ba37-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal\" (UID: \"4a9276482c5142a05a682b295ad6ba37\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.807298 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.807264 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 08:50:55.817629 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.817604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4a9276482c5142a05a682b295ad6ba37-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal\" (UID: \"4a9276482c5142a05a682b295ad6ba37\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.817748 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.817645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a9276482c5142a05a682b295ad6ba37-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal\" (UID: \"4a9276482c5142a05a682b295ad6ba37\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.817748 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.817663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4720a5d3bfeaf0855890af875d341e36-config\") pod \"kube-apiserver-proxy-ip-10-0-141-250.ec2.internal\" (UID: \"4720a5d3bfeaf0855890af875d341e36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.817748 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.817612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/4a9276482c5142a05a682b295ad6ba37-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal\" (UID: \"4a9276482c5142a05a682b295ad6ba37\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.908057 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:55.907978 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found" Apr 23 08:50:55.918323 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.918301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a9276482c5142a05a682b295ad6ba37-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal\" (UID: \"4a9276482c5142a05a682b295ad6ba37\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.918387 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.918330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4720a5d3bfeaf0855890af875d341e36-config\") pod \"kube-apiserver-proxy-ip-10-0-141-250.ec2.internal\" (UID: \"4720a5d3bfeaf0855890af875d341e36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.918387 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.918370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4720a5d3bfeaf0855890af875d341e36-config\") pod \"kube-apiserver-proxy-ip-10-0-141-250.ec2.internal\" (UID: \"4720a5d3bfeaf0855890af875d341e36\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" Apr 23 08:50:55.918448 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.918400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4a9276482c5142a05a682b295ad6ba37-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal\" (UID: \"4a9276482c5142a05a682b295ad6ba37\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal"
Apr 23 08:50:55.976478 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.976443 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal"
Apr 23 08:50:55.981318 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:55.981298 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal"
Apr 23 08:50:56.008221 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:56.008195 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found"
Apr 23 08:50:56.108783 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:56.108755 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found"
Apr 23 08:50:56.209282 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:56.209218 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found"
Apr 23 08:50:56.309760 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:56.309733 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found"
Apr 23 08:50:56.326156 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.326136 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 08:50:56.326283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.326265 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:50:56.326321 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.326298 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 08:50:56.407943 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.407891 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 08:45:55 +0000 UTC" deadline="2028-01-31 16:46:58.414358368 +0000 UTC"
Apr 23 08:50:56.407943 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.407932 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15559h56m2.006429076s"
Apr 23 08:50:56.410431 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:56.410416 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found"
Apr 23 08:50:56.414753 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.414729 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 08:50:56.428100 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.428079 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 08:50:56.443971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.443940 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-s9j67"
Apr 23 08:50:56.451555 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.451540 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-s9j67"
Apr 23 08:50:56.483614 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:56.483564 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a9276482c5142a05a682b295ad6ba37.slice/crio-2323cfd59ca84c0990b300d1764fcf9c366d1c2c325377405f0008e63af5224a WatchSource:0}: Error finding container 2323cfd59ca84c0990b300d1764fcf9c366d1c2c325377405f0008e63af5224a: Status 404 returned error can't find the container with id 2323cfd59ca84c0990b300d1764fcf9c366d1c2c325377405f0008e63af5224a
Apr 23 08:50:56.484069 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:56.484050 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4720a5d3bfeaf0855890af875d341e36.slice/crio-cdcb81cb4a5fbf4d7f35cce22c972a3f3f4736bb11a443073e265a173ae4bd9b WatchSource:0}: Error finding container cdcb81cb4a5fbf4d7f35cce22c972a3f3f4736bb11a443073e265a173ae4bd9b: Status 404 returned error can't find the container with id cdcb81cb4a5fbf4d7f35cce22c972a3f3f4736bb11a443073e265a173ae4bd9b
Apr 23 08:50:56.488022 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.488009 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 08:50:56.511507 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:56.511485 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-250.ec2.internal\" not found"
Apr 23 08:50:56.541569 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.541527 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" event={"ID":"4a9276482c5142a05a682b295ad6ba37","Type":"ContainerStarted","Data":"2323cfd59ca84c0990b300d1764fcf9c366d1c2c325377405f0008e63af5224a"}
Apr 23 08:50:56.542110 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.542091 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:56.542449 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.542429 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" event={"ID":"4720a5d3bfeaf0855890af875d341e36","Type":"ContainerStarted","Data":"cdcb81cb4a5fbf4d7f35cce22c972a3f3f4736bb11a443073e265a173ae4bd9b"}
Apr 23 08:50:56.616218 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.616195 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal"
Apr 23 08:50:56.627616 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.627598 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:50:56.629040 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.629028 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal"
Apr 23 08:50:56.637465 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.637452 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 08:50:56.660153 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:56.660130 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:57.176009 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.175980 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:57.260862 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.260830 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 08:50:57.391220 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.391192 2575 apiserver.go:52] "Watching apiserver"
Apr 23 08:50:57.398036 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.398015 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 08:50:57.400060 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.400036 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal","openshift-dns/node-resolver-5lf2l","openshift-network-diagnostics/network-check-target-kl2k9","openshift-network-operator/iptables-alerter-kk6q7","openshift-ovn-kubernetes/ovnkube-node-jtcqz","kube-system/konnectivity-agent-2bjvt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq","openshift-cluster-node-tuning-operator/tuned-tm68n","openshift-image-registry/node-ca-56m8r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal","openshift-multus/multus-additional-cni-plugins-c6cqt","openshift-multus/multus-rt965","openshift-multus/network-metrics-daemon-9tmnv"]
Apr 23 08:50:57.402511 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.402488 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.405172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.405116 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j4qxb\""
Apr 23 08:50:57.405290 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.405179 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 08:50:57.405290 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.405184 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.405404 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.405300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.407680 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.407641 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5lf2l"
Apr 23 08:50:57.407952 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.407834 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:50:57.407952 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.407924 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:50:57.409744 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.409719 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fjfk4\""
Apr 23 08:50:57.410149 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.409978 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.410149 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.410040 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.411076 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.411006 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kk6q7"
Apr 23 08:50:57.413291 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.413240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.413391 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.413344 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.413505 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.413487 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.413612 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.413527 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 08:50:57.413716 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.413700 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mphhw\""
Apr 23 08:50:57.415554 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.415533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:50:57.415737 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.415715 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 08:50:57.415813 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.415734 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d7sbx\""
Apr 23 08:50:57.415933 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.415914 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 08:50:57.416049 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.416033 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 08:50:57.416168 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.416152 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.416944 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.416928 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.417029 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.416970 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 08:50:57.417512 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.417494 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 08:50:57.417928 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.417911 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.417928 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.417924 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qvdn9\""
Apr 23 08:50:57.418072 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.417950 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 08:50:57.420144 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.420117 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5fj2c\""
Apr 23 08:50:57.420144 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.420137 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.420280 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.420171 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.422473 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.422451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-56m8r"
Apr 23 08:50:57.422600 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.422582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c6cqt"
Apr 23 08:50:57.424659 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.424593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.424756 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.424728 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 08:50:57.425015 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.424956 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.425015 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.424983 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 08:50:57.425153 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.425015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 08:50:57.425346 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.425322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2snlq\""
Apr 23 08:50:57.425737 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.425719 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lghp9\""
Apr 23 08:50:57.425737 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.425730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 08:50:57.425878 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.425832 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 08:50:57.425878 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.425859 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 08:50:57.427073 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-modprobe-d\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.427073 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-host\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.427073 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-sys-fs\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-ovn\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-cni-bin\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rt965"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.427230 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-sys\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.427279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4e667463-9112-48df-b2c9-8ff9e9415bce-tmp-dir\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l"
Apr 23 08:50:57.427588 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8pg\" (UniqueName: \"kubernetes.io/projected/4e667463-9112-48df-b2c9-8ff9e9415bce-kube-api-access-cb8pg\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l"
Apr 23 08:50:57.427588 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-log-socket\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428010 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.427989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-cni-netd\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428097 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysctl-d\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428097 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-lib-modules\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428097 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-iptables-alerter-script\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7"
Apr 23 08:50:57.428097 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-systemd\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-var-lib-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2d2\" (UniqueName: \"kubernetes.io/projected/d1ed91cc-5386-4f96-93e5-b81a3c676537-kube-api-access-gh2d2\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d13aed77-74bb-4ef7-a5d0-ae7948dc8568-agent-certs\") pod \"konnectivity-agent-2bjvt\" (UID: \"d13aed77-74bb-4ef7-a5d0-ae7948dc8568\") " pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d13aed77-74bb-4ef7-a5d0-ae7948dc8568-konnectivity-ca\") pod \"konnectivity-agent-2bjvt\" (UID: \"d13aed77-74bb-4ef7-a5d0-ae7948dc8568\") " pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysconfig\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-registration-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-device-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.428297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klr8t\" (UniqueName: \"kubernetes.io/projected/5eeeea03-820a-492e-8003-1f0b99cf2826-kube-api-access-klr8t\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-systemd-units\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-tuned\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwccn\" (UniqueName: \"kubernetes.io/projected/91b45c99-408d-4541-b831-3c2a3f9ba542-kube-api-access-qwccn\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-etc-selinux\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-host-slash\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-kubelet\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428537 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-slash\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428541 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-run-netns\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-etc-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-kubernetes\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysctl-conf\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-run\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkmkp\" (UniqueName: \"kubernetes.io/projected/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-kube-api-access-rkmkp\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-run-ovn-kubernetes\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovnkube-config\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-systemd\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91b45c99-408d-4541-b831-3c2a3f9ba542-tmp\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.428830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e667463-9112-48df-b2c9-8ff9e9415bce-hosts-file\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-node-log\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.428991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovn-node-metrics-cert\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-var-lib-kubelet\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-socket-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-env-overrides\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovnkube-script-lib\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz"
Apr 23 08:50:57.429365 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 08:50:57.429830 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.429791 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ggpnv\""
Apr 23 08:50:57.452988 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.452955 2575 certificate_manager.go:715] "Certificate rotation 
deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:45:56 +0000 UTC" deadline="2027-10-07 02:03:36.811862517 +0000 UTC" Apr 23 08:50:57.453308 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.453287 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12761h12m39.358585205s" Apr 23 08:50:57.516848 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.516816 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 08:50:57.530155 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-socket-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.530284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.530284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-modprobe-d\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-host\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-cni-bin\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.530284 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-host\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-socket-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530323 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-modprobe-d\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-cni-bin\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-498k4\" (UniqueName: \"kubernetes.io/projected/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-kube-api-access-498k4\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.530463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-cni-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.530463 
ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-multus-certs\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-sys\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4e667463-9112-48df-b2c9-8ff9e9415bce-tmp-dir\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-lib-modules\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 
08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-sys\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-var-lib-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh2d2\" (UniqueName: \"kubernetes.io/projected/d1ed91cc-5386-4f96-93e5-b81a3c676537-kube-api-access-gh2d2\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-os-release\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-lib-modules\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.530701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530695 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-cnibin\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-netns\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-var-lib-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-cni-multus\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d13aed77-74bb-4ef7-a5d0-ae7948dc8568-konnectivity-ca\") pod \"konnectivity-agent-2bjvt\" (UID: 
\"d13aed77-74bb-4ef7-a5d0-ae7948dc8568\") " pod="kube-system/konnectivity-agent-2bjvt" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysconfig\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysconfig\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klr8t\" (UniqueName: \"kubernetes.io/projected/5eeeea03-820a-492e-8003-1f0b99cf2826-kube-api-access-klr8t\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-systemd-units\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9t8p\" (UniqueName: 
\"kubernetes.io/projected/c3c1faf4-8a9e-479e-ac99-dbded210df17-kube-api-access-p9t8p\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4e667463-9112-48df-b2c9-8ff9e9415bce-tmp-dir\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.530996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-system-cni-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-kubelet\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-tuned\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-systemd-units\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-host-slash\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-slash\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-run-netns\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-etc-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-host-slash\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysctl-conf\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-run-netns\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-slash\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-run\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-etc-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-run\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkmkp\" (UniqueName: \"kubernetes.io/projected/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-kube-api-access-rkmkp\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-run-ovn-kubernetes\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531374 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovnkube-config\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531372 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d13aed77-74bb-4ef7-a5d0-ae7948dc8568-konnectivity-ca\") pod \"konnectivity-agent-2bjvt\" (UID: \"d13aed77-74bb-4ef7-a5d0-ae7948dc8568\") " pod="kube-system/konnectivity-agent-2bjvt" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-run-ovn-kubernetes\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.531888 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531358 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysctl-conf\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-socket-dir-parent\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e667463-9112-48df-b2c9-8ff9e9415bce-hosts-file\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-cni-bin\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531601 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0da171ba-bef9-4402-936e-2d5afc07a732-multus-daemon-config\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-etc-kubernetes\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-openvswitch\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4e667463-9112-48df-b2c9-8ff9e9415bce-hosts-file\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:50:57.531683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-env-overrides\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovnkube-script-lib\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-hostroot\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-sys-fs\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.532563 
ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-ovn\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmgbb\" (UniqueName: \"kubernetes.io/projected/c32e908b-8a1f-4d28-99e1-dce39209186a-kube-api-access-nmgbb\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:50:57.532563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8pg\" (UniqueName: \"kubernetes.io/projected/4e667463-9112-48df-b2c9-8ff9e9415bce-kube-api-access-cb8pg\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-log-socket\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-ovn\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" 
Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-cni-netd\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-sys-fs\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cnibin\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-log-socket\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531951 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovnkube-config\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysctl-d\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-iptables-alerter-script\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-systemd\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m95rl\" (UniqueName: \"kubernetes.io/projected/0da171ba-bef9-4402-936e-2d5afc07a732-kube-api-access-m95rl\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d13aed77-74bb-4ef7-a5d0-ae7948dc8568-agent-certs\") pod \"konnectivity-agent-2bjvt\" (UID: 
\"d13aed77-74bb-4ef7-a5d0-ae7948dc8568\") " pod="kube-system/konnectivity-agent-2bjvt" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532100 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-sysctl-d\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-registration-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-run-systemd\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.531991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-cni-netd\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.533311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-device-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" 
(UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-registration-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-node-log\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-device-dir\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovnkube-script-lib\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovn-node-metrics-cert\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ed91cc-5386-4f96-93e5-b81a3c676537-env-overrides\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-node-log\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-system-cni-dir\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwccn\" (UniqueName: \"kubernetes.io/projected/91b45c99-408d-4541-b831-3c2a3f9ba542-kube-api-access-qwccn\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532321 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-etc-selinux\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-kubelet\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3c1faf4-8a9e-479e-ac99-dbded210df17-serviceca\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-k8s-cni-cncf-io\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-kubelet\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532420 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-conf-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5eeeea03-820a-492e-8003-1f0b99cf2826-etc-selinux\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.534089 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-kubernetes\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3c1faf4-8a9e-479e-ac99-dbded210df17-host\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-os-release\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:50:57.532552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-systemd\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-kubernetes\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-systemd\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91b45c99-408d-4541-b831-3c2a3f9ba542-tmp\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534810 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:50:57.532717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0da171ba-bef9-4402-936e-2d5afc07a732-cni-binary-copy\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-var-lib-kubelet\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ed91cc-5386-4f96-93e5-b81a3c676537-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.532810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91b45c99-408d-4541-b831-3c2a3f9ba542-var-lib-kubelet\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.534810 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.533123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-iptables-alerter-script\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " 
pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.535199 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.535086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/91b45c99-408d-4541-b831-3c2a3f9ba542-etc-tuned\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.535199 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.535109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ed91cc-5386-4f96-93e5-b81a3c676537-ovn-node-metrics-cert\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.535261 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.535233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d13aed77-74bb-4ef7-a5d0-ae7948dc8568-agent-certs\") pod \"konnectivity-agent-2bjvt\" (UID: \"d13aed77-74bb-4ef7-a5d0-ae7948dc8568\") " pod="kube-system/konnectivity-agent-2bjvt" Apr 23 08:50:57.536411 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.536391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91b45c99-408d-4541-b831-3c2a3f9ba542-tmp\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.537960 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.537284 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:57.537960 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.537311 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:57.537960 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.537325 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:57.537960 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.537434 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:58.037402913 +0000 UTC m=+3.012710532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:57.538818 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.538764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2d2\" (UniqueName: \"kubernetes.io/projected/d1ed91cc-5386-4f96-93e5-b81a3c676537-kube-api-access-gh2d2\") pod \"ovnkube-node-jtcqz\" (UID: \"d1ed91cc-5386-4f96-93e5-b81a3c676537\") " pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.538818 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.538774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klr8t\" (UniqueName: 
\"kubernetes.io/projected/5eeeea03-820a-492e-8003-1f0b99cf2826-kube-api-access-klr8t\") pod \"aws-ebs-csi-driver-node-7ljrq\" (UID: \"5eeeea03-820a-492e-8003-1f0b99cf2826\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.538998 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.538977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkmkp\" (UniqueName: \"kubernetes.io/projected/e1c362ed-687e-4a36-bd04-7adb2e7cbf8b-kube-api-access-rkmkp\") pod \"iptables-alerter-kk6q7\" (UID: \"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b\") " pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.539379 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.539331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8pg\" (UniqueName: \"kubernetes.io/projected/4e667463-9112-48df-b2c9-8ff9e9415bce-kube-api-access-cb8pg\") pod \"node-resolver-5lf2l\" (UID: \"4e667463-9112-48df-b2c9-8ff9e9415bce\") " pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.539891 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.539870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwccn\" (UniqueName: \"kubernetes.io/projected/91b45c99-408d-4541-b831-3c2a3f9ba542-kube-api-access-qwccn\") pod \"tuned-tm68n\" (UID: \"91b45c99-408d-4541-b831-3c2a3f9ba542\") " pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.633875 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.633839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:50:57.633875 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.633881 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-os-release\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.633922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-cnibin\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.633946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-netns\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.633969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-cni-multus\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.633995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9t8p\" (UniqueName: \"kubernetes.io/projected/c3c1faf4-8a9e-479e-ac99-dbded210df17-kube-api-access-p9t8p\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.634009 2575 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-cnibin\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-os-release\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-cni-multus\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-netns\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-system-cni-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " 
pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634109 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:57.634091 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:58.134072835 +0000 UTC m=+3.109380444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-kubelet\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-system-cni-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:50:57.634181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-socket-dir-parent\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-socket-dir-parent\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-kubelet\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-cni-bin\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-var-lib-cni-bin\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:50:57.634291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0da171ba-bef9-4402-936e-2d5afc07a732-multus-daemon-config\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-etc-kubernetes\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-hostroot\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634353 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-etc-kubernetes\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmgbb\" (UniqueName: \"kubernetes.io/projected/c32e908b-8a1f-4d28-99e1-dce39209186a-kube-api-access-nmgbb\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634402 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cnibin\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m95rl\" (UniqueName: \"kubernetes.io/projected/0da171ba-bef9-4402-936e-2d5afc07a732-kube-api-access-m95rl\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-system-cni-dir\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3c1faf4-8a9e-479e-ac99-dbded210df17-serviceca\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.634727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-k8s-cni-cncf-io\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634529 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-conf-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3c1faf4-8a9e-479e-ac99-dbded210df17-host\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-os-release\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0da171ba-bef9-4402-936e-2d5afc07a732-cni-binary-copy\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-system-cni-dir\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-hostroot\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cnibin\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:50:57.634815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0da171ba-bef9-4402-936e-2d5afc07a732-multus-daemon-config\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-498k4\" (UniqueName: \"kubernetes.io/projected/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-kube-api-access-498k4\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-cni-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " 
pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-multus-certs\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.635527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.634985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3c1faf4-8a9e-479e-ac99-dbded210df17-host\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-multus-certs\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-host-run-k8s-cni-cncf-io\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c3c1faf4-8a9e-479e-ac99-dbded210df17-serviceca\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.636172 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:50:57.635113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-os-release\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635122 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-conf-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0da171ba-bef9-4402-936e-2d5afc07a732-multus-cni-dir\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0da171ba-bef9-4402-936e-2d5afc07a732-cni-binary-copy\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.636172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.635736 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.642936 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.642889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9t8p\" (UniqueName: \"kubernetes.io/projected/c3c1faf4-8a9e-479e-ac99-dbded210df17-kube-api-access-p9t8p\") pod \"node-ca-56m8r\" (UID: \"c3c1faf4-8a9e-479e-ac99-dbded210df17\") " pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.643270 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.643238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-498k4\" (UniqueName: \"kubernetes.io/projected/35e62b41-5dc1-4f18-a2d1-a4c01ace11a3-kube-api-access-498k4\") pod \"multus-additional-cni-plugins-c6cqt\" (UID: \"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3\") " pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.643453 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.643429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m95rl\" (UniqueName: \"kubernetes.io/projected/0da171ba-bef9-4402-936e-2d5afc07a732-kube-api-access-m95rl\") pod \"multus-rt965\" (UID: \"0da171ba-bef9-4402-936e-2d5afc07a732\") " pod="openshift-multus/multus-rt965" Apr 23 08:50:57.643533 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.643514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmgbb\" (UniqueName: \"kubernetes.io/projected/c32e908b-8a1f-4d28-99e1-dce39209186a-kube-api-access-nmgbb\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 
08:50:57.715473 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.715393 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" Apr 23 08:50:57.724269 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.724246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5lf2l" Apr 23 08:50:57.733690 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.733672 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kk6q7" Apr 23 08:50:57.739341 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.739324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:50:57.744837 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.744818 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2bjvt" Apr 23 08:50:57.751413 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.751393 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tm68n" Apr 23 08:50:57.759909 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.759877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-56m8r" Apr 23 08:50:57.766406 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.766391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" Apr 23 08:50:57.770920 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:57.770886 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-rt965" Apr 23 08:50:58.085615 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.085581 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c1faf4_8a9e_479e_ac99_dbded210df17.slice/crio-0b88fd652fc3b5739f96150f38471f66c370a559c4309095678e0c4d4fdfd229 WatchSource:0}: Error finding container 0b88fd652fc3b5739f96150f38471f66c370a559c4309095678e0c4d4fdfd229: Status 404 returned error can't find the container with id 0b88fd652fc3b5739f96150f38471f66c370a559c4309095678e0c4d4fdfd229 Apr 23 08:50:58.086635 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.086598 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13aed77_74bb_4ef7_a5d0_ae7948dc8568.slice/crio-44a498ffb431f0e20615b9d272c381c2e37453b6ae6fcf6a379a2aef98f0e05f WatchSource:0}: Error finding container 44a498ffb431f0e20615b9d272c381c2e37453b6ae6fcf6a379a2aef98f0e05f: Status 404 returned error can't find the container with id 44a498ffb431f0e20615b9d272c381c2e37453b6ae6fcf6a379a2aef98f0e05f Apr 23 08:50:58.088018 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.087867 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e667463_9112_48df_b2c9_8ff9e9415bce.slice/crio-1a5aece2c6292dd38768ec63c89b93405c0d5983c5446f8b5fa90daa4c771b3e WatchSource:0}: Error finding container 1a5aece2c6292dd38768ec63c89b93405c0d5983c5446f8b5fa90daa4c771b3e: Status 404 returned error can't find the container with id 1a5aece2c6292dd38768ec63c89b93405c0d5983c5446f8b5fa90daa4c771b3e Apr 23 08:50:58.090758 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.090718 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ed91cc_5386_4f96_93e5_b81a3c676537.slice/crio-689a1324cdacfad09142f640dab9d035ea00a3d5df987be7bfa88286ecddb16a WatchSource:0}: Error finding container 689a1324cdacfad09142f640dab9d035ea00a3d5df987be7bfa88286ecddb16a: Status 404 returned error can't find the container with id 689a1324cdacfad09142f640dab9d035ea00a3d5df987be7bfa88286ecddb16a Apr 23 08:50:58.091925 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.091866 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eeeea03_820a_492e_8003_1f0b99cf2826.slice/crio-6bb6f078812751f730d57bac1df4d2c7dc13c8246888a1f5db2d7bc662044681 WatchSource:0}: Error finding container 6bb6f078812751f730d57bac1df4d2c7dc13c8246888a1f5db2d7bc662044681: Status 404 returned error can't find the container with id 6bb6f078812751f730d57bac1df4d2c7dc13c8246888a1f5db2d7bc662044681 Apr 23 08:50:58.093030 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.093011 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b45c99_408d_4541_b831_3c2a3f9ba542.slice/crio-7a9db8fc40eb8a044afbd4d8406db415a5db65951724e0f3dcca978742e96e3e WatchSource:0}: Error finding container 7a9db8fc40eb8a044afbd4d8406db415a5db65951724e0f3dcca978742e96e3e: Status 404 returned error can't find the container with id 7a9db8fc40eb8a044afbd4d8406db415a5db65951724e0f3dcca978742e96e3e Apr 23 08:50:58.093993 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.093968 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0da171ba_bef9_4402_936e_2d5afc07a732.slice/crio-491d1e6cc73af098668fdc01607738cdd7026174933703df9b8b82410a1ea7ae WatchSource:0}: Error finding container 491d1e6cc73af098668fdc01607738cdd7026174933703df9b8b82410a1ea7ae: Status 404 returned error can't find 
the container with id 491d1e6cc73af098668fdc01607738cdd7026174933703df9b8b82410a1ea7ae Apr 23 08:50:58.095064 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.094952 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35e62b41_5dc1_4f18_a2d1_a4c01ace11a3.slice/crio-9124053b6402f19ab599b4f11513c289464ab578f7cb2ae7d67ed8f12f95ed67 WatchSource:0}: Error finding container 9124053b6402f19ab599b4f11513c289464ab578f7cb2ae7d67ed8f12f95ed67: Status 404 returned error can't find the container with id 9124053b6402f19ab599b4f11513c289464ab578f7cb2ae7d67ed8f12f95ed67 Apr 23 08:50:58.097431 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:50:58.097080 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c362ed_687e_4a36_bd04_7adb2e7cbf8b.slice/crio-20f75eeb53d7ad558ac15cf5a0773ebe56fa36713f1311107d766e5a982c4a2d WatchSource:0}: Error finding container 20f75eeb53d7ad558ac15cf5a0773ebe56fa36713f1311107d766e5a982c4a2d: Status 404 returned error can't find the container with id 20f75eeb53d7ad558ac15cf5a0773ebe56fa36713f1311107d766e5a982c4a2d Apr 23 08:50:58.137281 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.137138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:50:58.137281 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:58.137279 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:50:58.137407 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:58.137295 2575 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:50:58.137407 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:58.137304 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:58.137407 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.137316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:50:58.137407 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:58.137342 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:50:59.137329173 +0000 UTC m=+4.112636778 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:50:58.137407 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:58.137388 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:58.137591 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:58.137425 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:50:59.137414719 +0000 UTC m=+4.112722324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:50:58.454483 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.454373 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 08:45:56 +0000 UTC" deadline="2027-11-03 02:57:08.37845723 +0000 UTC" Apr 23 08:50:58.454483 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.454405 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13410h6m9.92405556s" Apr 23 08:50:58.561913 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.561856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" event={"ID":"4720a5d3bfeaf0855890af875d341e36","Type":"ContainerStarted","Data":"4a89e9fcbf56c7390980e582ebe04c535516b2c05849def6080bf30ca116d782"} Apr 23 08:50:58.564847 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.564786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kk6q7" event={"ID":"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b","Type":"ContainerStarted","Data":"20f75eeb53d7ad558ac15cf5a0773ebe56fa36713f1311107d766e5a982c4a2d"} Apr 23 08:50:58.568913 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.568837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerStarted","Data":"9124053b6402f19ab599b4f11513c289464ab578f7cb2ae7d67ed8f12f95ed67"} Apr 23 08:50:58.570754 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.570699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-rt965" event={"ID":"0da171ba-bef9-4402-936e-2d5afc07a732","Type":"ContainerStarted","Data":"491d1e6cc73af098668fdc01607738cdd7026174933703df9b8b82410a1ea7ae"} Apr 23 08:50:58.577075 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.576421 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-250.ec2.internal" podStartSLOduration=2.576406881 podStartE2EDuration="2.576406881s" podCreationTimestamp="2026-04-23 08:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:50:58.576023976 +0000 UTC m=+3.551331603" watchObservedRunningTime="2026-04-23 08:50:58.576406881 +0000 UTC m=+3.551714510" Apr 23 08:50:58.592610 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.592585 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"689a1324cdacfad09142f640dab9d035ea00a3d5df987be7bfa88286ecddb16a"} Apr 23 08:50:58.596984 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.596958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2bjvt" event={"ID":"d13aed77-74bb-4ef7-a5d0-ae7948dc8568","Type":"ContainerStarted","Data":"44a498ffb431f0e20615b9d272c381c2e37453b6ae6fcf6a379a2aef98f0e05f"} Apr 23 08:50:58.611166 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.611117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tm68n" event={"ID":"91b45c99-408d-4541-b831-3c2a3f9ba542","Type":"ContainerStarted","Data":"7a9db8fc40eb8a044afbd4d8406db415a5db65951724e0f3dcca978742e96e3e"} Apr 23 08:50:58.614143 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.614094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" event={"ID":"5eeeea03-820a-492e-8003-1f0b99cf2826","Type":"ContainerStarted","Data":"6bb6f078812751f730d57bac1df4d2c7dc13c8246888a1f5db2d7bc662044681"}
Apr 23 08:50:58.622401 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.622344 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5lf2l" event={"ID":"4e667463-9112-48df-b2c9-8ff9e9415bce","Type":"ContainerStarted","Data":"1a5aece2c6292dd38768ec63c89b93405c0d5983c5446f8b5fa90daa4c771b3e"}
Apr 23 08:50:58.628298 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:58.628233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-56m8r" event={"ID":"c3c1faf4-8a9e-479e-ac99-dbded210df17","Type":"ContainerStarted","Data":"0b88fd652fc3b5739f96150f38471f66c370a559c4309095678e0c4d4fdfd229"}
Apr 23 08:50:59.145726 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:59.145687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:50:59.145921 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:59.145772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:50:59.145921 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.145911 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:59.146037 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.145973 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:51:01.145955516 +0000 UTC m=+6.121263135 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:50:59.146371 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.146354 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:50:59.146433 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.146380 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:50:59.146433 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.146393 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:59.146433 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.146431 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:01.146419044 +0000 UTC m=+6.121726655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:50:59.541872 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:59.541189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:50:59.541872 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.541332 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:50:59.541872 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:59.541741 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:50:59.541872 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:50:59.541831 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:50:59.651391 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:59.651353 2575 generic.go:358] "Generic (PLEG): container finished" podID="4a9276482c5142a05a682b295ad6ba37" containerID="3027081e6ec265c46d79035a75443be80fc2cc25d11a08585318f96191d15680" exitCode=0
Apr 23 08:50:59.651578 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:50:59.651477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" event={"ID":"4a9276482c5142a05a682b295ad6ba37","Type":"ContainerDied","Data":"3027081e6ec265c46d79035a75443be80fc2cc25d11a08585318f96191d15680"}
Apr 23 08:51:00.657090 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:00.657034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" event={"ID":"4a9276482c5142a05a682b295ad6ba37","Type":"ContainerStarted","Data":"354d7d42648848a1300a230598275f954710e725a7c9e2c31cb74e407220125e"}
Apr 23 08:51:01.164655 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:01.164618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:01.164689 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.164726 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.164786 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:51:05.164769071 +0000 UTC m=+10.140076692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.164786 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.164807 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.164819 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:01.164844 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.164850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:05.164840144 +0000 UTC m=+10.140147764 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:01.540971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:01.540310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:01.540971 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.540516 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:01.540971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:01.540626 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:01.540971 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:01.540722 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:03.540248 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:03.540214 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:03.540699 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:03.540335 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:03.540760 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:03.540741 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:03.540888 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:03.540863 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:05.202572 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:05.202533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:05.202602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.202722 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.202768 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.202791 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.202806 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.202807 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:51:13.202786754 +0000 UTC m=+18.178094373 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:05.203064 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.202866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:13.20284175 +0000 UTC m=+18.178149367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:05.541209 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:05.541173 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:05.541397 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.541284 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:05.541397 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:05.541374 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:05.541517 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:05.541489 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:07.540366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:07.540330 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:07.540817 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:07.540459 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:07.540817 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:07.540494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:07.540817 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:07.540582 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:09.539764 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:09.539736 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:09.540209 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:09.539843 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:09.540209 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:09.539923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:09.540209 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:09.540055 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:11.539954 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:11.539916 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:11.540503 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:11.539916 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:11.540503 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:11.540042 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:11.540503 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:11.540125 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:13.257762 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:13.257726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:13.257775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.257877 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.257889 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.257919 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.257931 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.257960 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:51:29.257939737 +0000 UTC m=+34.233247353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 08:51:13.258264 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.257976 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:29.257969774 +0000 UTC m=+34.233277382 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 08:51:13.539616 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:13.539534 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:13.539771 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.539667 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:13.539771 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:13.539724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:13.539872 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:13.539847 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:15.541057 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.540735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:15.541754 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:15.541108 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a"
Apr 23 08:51:15.541754 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.540764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:15.541754 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:15.541259 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:15.689117 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.689081 2575 generic.go:358] "Generic (PLEG): container finished" podID="35e62b41-5dc1-4f18-a2d1-a4c01ace11a3" containerID="7a7814797ce444e8289cd1273e4a0cd1233fa6caa1e0885a013e90d4255317a6" exitCode=0
Apr 23 08:51:15.689262 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.689171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerDied","Data":"7a7814797ce444e8289cd1273e4a0cd1233fa6caa1e0885a013e90d4255317a6"}
Apr 23 08:51:15.691551 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.691519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rt965" event={"ID":"0da171ba-bef9-4402-936e-2d5afc07a732","Type":"ContainerStarted","Data":"16a60c00782a560b764eb796458ab79f8d7af4b570be19a74d25fbdc084e52e0"}
Apr 23 08:51:15.695199 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.695175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"2dc3c77154ce34b05a3e05d31623cce27b2b940934787ffa47c648b68f82801d"}
Apr 23 08:51:15.695302 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.695209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"aa1f2ffa92431a9a64cc075c45bb83cbdd661ab47cf7aebc7bed8048e94da521"}
Apr 23 08:51:15.695302 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.695225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"5c567af9b28917f751042fc5e4b97dcdbe3a19f451e2e27dd849bc5d5e3293bc"}
Apr 23 08:51:15.695302 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.695237 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"54eda9b16e18274ca5109046595d0ec5dcd426ef3eb7be1cd57e1ac91f0a6dce"}
Apr 23 08:51:15.695302 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.695250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"c4646d50def9d6fea95d0c332ab98d38a5b69e1e08f402ce9439e4c56c150214"}
Apr 23 08:51:15.696381 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.696354 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2bjvt" event={"ID":"d13aed77-74bb-4ef7-a5d0-ae7948dc8568","Type":"ContainerStarted","Data":"ac7a5afa98fd92abedff871ae9733556b9cb04428b954ab90d5c6247b5290ab7"}
Apr 23 08:51:15.697587 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.697569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tm68n" event={"ID":"91b45c99-408d-4541-b831-3c2a3f9ba542","Type":"ContainerStarted","Data":"5f4e1ab5f3e2a29d940068368a2e6c1867bff28f91e4b774021cf07ffe93aecc"}
Apr 23 08:51:15.698788 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.698761 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" event={"ID":"5eeeea03-820a-492e-8003-1f0b99cf2826","Type":"ContainerStarted","Data":"fce91de90dee9ebb2aa6ef4ae3b4a2fb19311cb25766dd2107defac6110a42ce"}
Apr 23 08:51:15.699949 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.699927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5lf2l" event={"ID":"4e667463-9112-48df-b2c9-8ff9e9415bce","Type":"ContainerStarted","Data":"0043182a87241c5bc155e1314701a72b9d8356bb2cae135e771bbd337932eb1e"}
Apr 23 08:51:15.701212 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.701062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-56m8r" event={"ID":"c3c1faf4-8a9e-479e-ac99-dbded210df17","Type":"ContainerStarted","Data":"3ab5e4317e87ad510e05d09362c00c0dc26a4f114362a7c6e65dc1944268f587"}
Apr 23 08:51:15.711884 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.711846 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-250.ec2.internal" podStartSLOduration=19.711832646 podStartE2EDuration="19.711832646s" podCreationTimestamp="2026-04-23 08:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:51:00.674137404 +0000 UTC m=+5.649445033" watchObservedRunningTime="2026-04-23 08:51:15.711832646 +0000 UTC m=+20.687140273"
Apr 23 08:51:15.726158 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.726110 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5lf2l" podStartSLOduration=4.044603855 podStartE2EDuration="20.726096983s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.089695783 +0000 UTC m=+3.065003390" lastFinishedPulling="2026-04-23 08:51:14.771188907 +0000 UTC m=+19.746496518" observedRunningTime="2026-04-23 08:51:15.725828717 +0000 UTC m=+20.701136345" watchObservedRunningTime="2026-04-23 08:51:15.726096983 +0000 UTC m=+20.701404609"
Apr 23 08:51:15.743005 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.742961 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2bjvt" podStartSLOduration=11.998609465 podStartE2EDuration="20.742945757s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.089859459 +0000 UTC m=+3.065167068" lastFinishedPulling="2026-04-23 08:51:06.83419574 +0000 UTC m=+11.809503360" observedRunningTime="2026-04-23 08:51:15.742732884 +0000 UTC m=+20.718040511" watchObservedRunningTime="2026-04-23 08:51:15.742945757 +0000 UTC m=+20.718253383"
Apr 23 08:51:15.760157 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.760110 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tm68n" podStartSLOduration=4.079415278 podStartE2EDuration="20.760098159s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.095369393 +0000 UTC m=+3.070677016" lastFinishedPulling="2026-04-23 08:51:14.776052278 +0000 UTC m=+19.751359897" observedRunningTime="2026-04-23 08:51:15.760015417 +0000 UTC m=+20.735323037" watchObservedRunningTime="2026-04-23 08:51:15.760098159 +0000 UTC m=+20.735405786"
Apr 23 08:51:15.773318 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.773278 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-56m8r" podStartSLOduration=4.090020287 podStartE2EDuration="20.773266613s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.087970382 +0000 UTC m=+3.063277988" lastFinishedPulling="2026-04-23 08:51:14.771216706 +0000 UTC m=+19.746524314" observedRunningTime="2026-04-23 08:51:15.772689106 +0000 UTC m=+20.747996735" watchObservedRunningTime="2026-04-23 08:51:15.773266613 +0000 UTC m=+20.748574240"
Apr 23 08:51:15.788190 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:15.788155 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rt965" podStartSLOduration=4.086547518 podStartE2EDuration="20.788144003s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.096130749 +0000 UTC m=+3.071438359" lastFinishedPulling="2026-04-23 08:51:14.797727238 +0000 UTC m=+19.773034844" observedRunningTime="2026-04-23 08:51:15.787970261 +0000 UTC m=+20.763277900" watchObservedRunningTime="2026-04-23 08:51:15.788144003 +0000 UTC m=+20.763451628"
Apr 23 08:51:16.026759 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.026726 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 08:51:16.485331 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.485250 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T08:51:16.026747108Z","UUID":"c96a6da0-617f-4525-ba00-b7e149ff451f","Handler":null,"Name":"","Endpoint":""}
Apr 23 08:51:16.488850 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.488619 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 08:51:16.488850 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.488649 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 08:51:16.707328 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.707287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" event={"ID":"5eeeea03-820a-492e-8003-1f0b99cf2826","Type":"ContainerStarted","Data":"72cd18b68e7f2cfa8ae9b68704d993fb866b97c7281342c7f9399e0eac93e6bb"}
Apr 23 08:51:16.709212 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.709180 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kk6q7" event={"ID":"e1c362ed-687e-4a36-bd04-7adb2e7cbf8b","Type":"ContainerStarted","Data":"8b8da2d8113dc9dad9a5b762aa3cc8d3637c4725e8113e5d7ea5d41b6febc82d"}
Apr 23 08:51:16.712567 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.712531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"be813e1509c3bba15e7fc9416ddd35cd78bdb234622a878f365cd82ab971983d"}
Apr 23 08:51:16.782664 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.782582 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:51:16.783531 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.783508 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:51:16.803192 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:16.803140 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kk6q7" podStartSLOduration=5.130465244 podStartE2EDuration="21.803122871s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.098562264 +0000 UTC m=+3.073869870" lastFinishedPulling="2026-04-23 08:51:14.77121988 +0000 UTC m=+19.746527497" observedRunningTime="2026-04-23 08:51:16.724365499 +0000 UTC m=+21.699673128" watchObservedRunningTime="2026-04-23 08:51:16.803122871 +0000 UTC m=+21.778430499"
Apr 23 08:51:17.452409 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:17.452335 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:51:17.452409 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:17.452398 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2bjvt"
Apr 23 08:51:17.539998 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:17.539921 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9"
Apr 23 08:51:17.539998 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:17.539943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:51:17.540219 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:17.540064 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3"
Apr 23 08:51:17.540219 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:17.540184 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:17.716772 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:17.716689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" event={"ID":"5eeeea03-820a-492e-8003-1f0b99cf2826","Type":"ContainerStarted","Data":"a4fd0368fdeb72686275c5211e7bf80636da5137f68683a70e889ff7f07ea439"} Apr 23 08:51:17.737698 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:17.737640 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7ljrq" podStartSLOduration=3.9085953509999998 podStartE2EDuration="22.73762399s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.093848509 +0000 UTC m=+3.069156119" lastFinishedPulling="2026-04-23 08:51:16.922877141 +0000 UTC m=+21.898184758" observedRunningTime="2026-04-23 08:51:17.737219814 +0000 UTC m=+22.712527441" watchObservedRunningTime="2026-04-23 08:51:17.73762399 +0000 UTC m=+22.712931616" Apr 23 08:51:18.721911 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:18.721690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"7079c8f82dfc086d8d55ea77b2e71c9216f863e0699071ae1e12bdfd8db0556a"} Apr 23 08:51:19.539685 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:19.539650 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:19.539862 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:19.539750 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3" Apr 23 08:51:19.539862 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:19.539811 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:19.539999 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:19.539946 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:20.726644 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.726440 2575 generic.go:358] "Generic (PLEG): container finished" podID="35e62b41-5dc1-4f18-a2d1-a4c01ace11a3" containerID="04699ff31c2301644c20f7538f9005e56c968aa96a5ac42f86e0cdb9ff68fcf4" exitCode=0 Apr 23 08:51:20.727457 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.726520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerDied","Data":"04699ff31c2301644c20f7538f9005e56c968aa96a5ac42f86e0cdb9ff68fcf4"} Apr 23 08:51:20.729879 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.729854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" event={"ID":"d1ed91cc-5386-4f96-93e5-b81a3c676537","Type":"ContainerStarted","Data":"0852864b8fcb94b4cf6fa46b8c5bffa0dfc282274271d33e4d4d5dcbac567e61"} Apr 23 08:51:20.730190 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.730169 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:51:20.730190 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.730197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:51:20.730190 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.730210 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:51:20.744462 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.744436 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:51:20.744572 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.744510 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:51:20.772196 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:20.772148 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" podStartSLOduration=8.605571654 podStartE2EDuration="25.772134737s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.093047917 +0000 UTC m=+3.068355528" lastFinishedPulling="2026-04-23 08:51:15.259611003 +0000 UTC m=+20.234918611" observedRunningTime="2026-04-23 08:51:20.770550738 +0000 UTC m=+25.745858363" watchObservedRunningTime="2026-04-23 08:51:20.772134737 +0000 UTC m=+25.747442364" Apr 23 08:51:21.541968 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.541947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:21.542061 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.541947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:21.542111 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:21.542074 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3" Apr 23 08:51:21.542171 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:21.542129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:21.733258 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.733229 2575 generic.go:358] "Generic (PLEG): container finished" podID="35e62b41-5dc1-4f18-a2d1-a4c01ace11a3" containerID="1360b068574fce7331b7a3c95bf6c79f9c42d2f40c904ee63e5793815186675c" exitCode=0 Apr 23 08:51:21.733594 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.733322 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerDied","Data":"1360b068574fce7331b7a3c95bf6c79f9c42d2f40c904ee63e5793815186675c"} Apr 23 08:51:21.870441 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.870408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kl2k9"] Apr 23 08:51:21.870576 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.870492 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:21.870576 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:21.870565 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3" Apr 23 08:51:21.873316 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.873292 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9tmnv"] Apr 23 08:51:21.873402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:21.873390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:21.873517 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:21.873498 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:22.737278 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:22.737248 2575 generic.go:358] "Generic (PLEG): container finished" podID="35e62b41-5dc1-4f18-a2d1-a4c01ace11a3" containerID="baab5c32a92ed2a96eb302180278d577986601b64fddc948b1fc724950fc2439" exitCode=0 Apr 23 08:51:22.737638 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:22.737343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerDied","Data":"baab5c32a92ed2a96eb302180278d577986601b64fddc948b1fc724950fc2439"} Apr 23 08:51:23.540098 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:23.540060 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:23.540268 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:23.540102 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:23.540268 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:23.540184 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:23.540409 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:23.540324 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3" Apr 23 08:51:25.540790 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:25.540764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:25.541304 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:25.540848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3" Apr 23 08:51:25.541304 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:25.540939 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:25.541304 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:25.541034 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:27.539825 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.539791 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:27.539825 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.539823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:27.540442 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:27.539962 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:51:27.540442 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:27.540184 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kl2k9" podUID="81dd2f7c-f618-4c84-81fd-ff2be1c08dc3" Apr 23 08:51:27.922760 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.922688 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-250.ec2.internal" event="NodeReady" Apr 23 08:51:27.922924 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.922820 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 08:51:27.969719 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.969689 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qfjzv"] Apr 23 08:51:27.996363 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.996323 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qm6xv"] Apr 23 08:51:27.996534 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.996514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:27.998937 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.998892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 08:51:27.998937 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.998935 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 08:51:27.999113 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.998969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nrkt9\"" Apr 23 08:51:27.999113 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:27.999018 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 08:51:28.009075 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.009056 
2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qfjzv"] Apr 23 08:51:28.009075 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.009076 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qm6xv"] Apr 23 08:51:28.009258 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.009176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.011467 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.011445 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 08:51:28.011576 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.011522 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cjh2d\"" Apr 23 08:51:28.011640 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.011615 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 08:51:28.067806 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.067778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:28.067936 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.067837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66dpj\" (UniqueName: \"kubernetes.io/projected/fbb544b6-122a-4e2a-9835-e970e273e58b-kube-api-access-66dpj\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:28.169095 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:51:28.169067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.169095 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.169100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3c57c70-2bd6-42fa-9ece-35b56e75a778-tmp-dir\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.169278 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.169121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66dpj\" (UniqueName: \"kubernetes.io/projected/fbb544b6-122a-4e2a-9835-e970e273e58b-kube-api-access-66dpj\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:28.169278 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.169225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:28.169348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.169273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skdsf\" (UniqueName: \"kubernetes.io/projected/f3c57c70-2bd6-42fa-9ece-35b56e75a778-kube-api-access-skdsf\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.169348 ip-10-0-141-250 
kubenswrapper[2575]: E0423 08:51:28.169308 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:28.169348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.169310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c57c70-2bd6-42fa-9ece-35b56e75a778-config-volume\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.169441 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.169395 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:28.669362119 +0000 UTC m=+33.644669732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:28.181339 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.181281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66dpj\" (UniqueName: \"kubernetes.io/projected/fbb544b6-122a-4e2a-9835-e970e273e58b-kube-api-access-66dpj\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:28.270276 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.270245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" 
Apr 23 08:51:28.270414 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.270281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3c57c70-2bd6-42fa-9ece-35b56e75a778-tmp-dir\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.270414 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.270341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skdsf\" (UniqueName: \"kubernetes.io/projected/f3c57c70-2bd6-42fa-9ece-35b56e75a778-kube-api-access-skdsf\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.270414 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.270365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c57c70-2bd6-42fa-9ece-35b56e75a778-config-volume\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.270414 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.270396 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:28.270599 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.270461 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:28.770444698 +0000 UTC m=+33.745752307 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:51:28.270663 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.270600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3c57c70-2bd6-42fa-9ece-35b56e75a778-tmp-dir\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.270884 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.270864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f3c57c70-2bd6-42fa-9ece-35b56e75a778-config-volume\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.279323 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.279307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skdsf\" (UniqueName: \"kubernetes.io/projected/f3c57c70-2bd6-42fa-9ece-35b56e75a778-kube-api-access-skdsf\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.673021 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.672994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:28.673520 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.673097 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 
08:51:28.673520 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.673149 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:29.673134917 +0000 UTC m=+34.648442522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:28.752166 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.752138 2575 generic.go:358] "Generic (PLEG): container finished" podID="35e62b41-5dc1-4f18-a2d1-a4c01ace11a3" containerID="35138505d9745d4b0974b1600a970f79a8995529c713fa7face9bc2c7427b25b" exitCode=0 Apr 23 08:51:28.752311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.752186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerDied","Data":"35138505d9745d4b0974b1600a970f79a8995529c713fa7face9bc2c7427b25b"} Apr 23 08:51:28.773278 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:28.773255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:28.773433 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.773413 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:28.773507 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:28.773497 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:29.773476306 +0000 UTC m=+34.748783922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:51:29.275479 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.275434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:29.275705 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.275491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:29.275705 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.275599 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:51:29.275705 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.275675 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:52:01.275653723 +0000 UTC m=+66.250961335 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 08:51:29.275705 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.275607 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 08:51:29.275705 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.275708 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 08:51:29.275995 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.275718 2575 projected.go:194] Error preparing data for projected volume kube-api-access-rvjsm for pod openshift-network-diagnostics/network-check-target-kl2k9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:51:29.275995 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.275759 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm podName:81dd2f7c-f618-4c84-81fd-ff2be1c08dc3 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:01.275748097 +0000 UTC m=+66.251055707 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rvjsm" (UniqueName: "kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm") pod "network-check-target-kl2k9" (UID: "81dd2f7c-f618-4c84-81fd-ff2be1c08dc3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 08:51:29.539884 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.539852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:51:29.540083 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.539852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:51:29.545234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.545202 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:51:29.545234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.545216 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9mgcs\"" Apr 23 08:51:29.545234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.545202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wtr6s\"" Apr 23 08:51:29.545477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.545213 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:51:29.545477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.545202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:51:29.678336 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:51:29.678306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:29.678814 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.678418 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:29.678814 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.678472 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:31.678456286 +0000 UTC m=+36.653763891 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:29.756971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.756943 2575 generic.go:358] "Generic (PLEG): container finished" podID="35e62b41-5dc1-4f18-a2d1-a4c01ace11a3" containerID="3e6ef6f9cd5926650f5959ba2726769005ffeb49059f90107d29aa3a46471296" exitCode=0 Apr 23 08:51:29.757129 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.756999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerDied","Data":"3e6ef6f9cd5926650f5959ba2726769005ffeb49059f90107d29aa3a46471296"} Apr 23 08:51:29.779177 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:29.779153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:29.779298 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.779284 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:29.779371 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:29.779357 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:31.779339505 +0000 UTC m=+36.754647115 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:51:30.761866 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:30.761681 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" event={"ID":"35e62b41-5dc1-4f18-a2d1-a4c01ace11a3","Type":"ContainerStarted","Data":"ea704ecc4f93b1488ea898e0718218ae10c5d59adb365d109f58e371f64764e7"} Apr 23 08:51:30.785990 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:30.785948 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c6cqt" podStartSLOduration=5.53444055 podStartE2EDuration="35.785881893s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:50:58.097305208 +0000 UTC m=+3.072612826" lastFinishedPulling="2026-04-23 08:51:28.348746562 +0000 UTC m=+33.324054169" observedRunningTime="2026-04-23 08:51:30.785698384 +0000 UTC m=+35.761006011" watchObservedRunningTime="2026-04-23 
08:51:30.785881893 +0000 UTC m=+35.761189519" Apr 23 08:51:31.692060 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:31.691973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:31.692223 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:31.692158 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:31.692284 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:31.692233 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:35.692213512 +0000 UTC m=+40.667521117 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:31.792986 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:31.792957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:31.793351 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:31.793090 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:31.793351 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:31.793151 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:35.793136213 +0000 UTC m=+40.768443818 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:51:35.720319 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:35.720276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:35.720697 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:35.720428 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:35.720697 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:35.720491 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:43.720476395 +0000 UTC m=+48.695784000 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:35.821141 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:35.821104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:35.821305 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:35.821222 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:35.821305 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:35.821269 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:43.82125691 +0000 UTC m=+48.796564515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:51:43.771629 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:43.771591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:43.772085 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:43.771698 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:43.772085 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:43.771749 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:51:59.771735295 +0000 UTC m=+64.747042900 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:43.872106 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:43.872070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:43.872277 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:43.872179 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:43.872277 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:43.872226 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:51:59.872213441 +0000 UTC m=+64.847521047 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:51:52.747293 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:52.747265 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jtcqz" Apr 23 08:51:59.780483 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:59.780442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:51:59.780857 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:59.780588 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:51:59.780857 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:59.780654 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:52:31.780636349 +0000 UTC m=+96.755943969 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:51:59.881456 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:51:59.881421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:51:59.881615 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:59.881528 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:51:59.881615 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:51:59.881590 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:31.881574921 +0000 UTC m=+96.856882526 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:52:01.292021 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.291981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:52:01.292395 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.292038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:52:01.295249 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.295230 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 08:52:01.295305 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.295276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 08:52:01.302602 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:01.302580 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:52:01.302681 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:01.302651 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a 
nodeName:}" failed. No retries permitted until 2026-04-23 08:53:05.302630844 +0000 UTC m=+130.277938449 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : secret "metrics-daemon-secret" not found Apr 23 08:52:01.305467 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.305451 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 08:52:01.315739 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.315718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvjsm\" (UniqueName: \"kubernetes.io/projected/81dd2f7c-f618-4c84-81fd-ff2be1c08dc3-kube-api-access-rvjsm\") pod \"network-check-target-kl2k9\" (UID: \"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3\") " pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:52:01.354225 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.354196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9mgcs\"" Apr 23 08:52:01.362442 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.362423 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:52:01.482197 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.482167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kl2k9"] Apr 23 08:52:01.485541 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:52:01.485512 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81dd2f7c_f618_4c84_81fd_ff2be1c08dc3.slice/crio-e76ddfefaa121531254cb057bb63d0050a9d7c898e14be42796a884496b456c3 WatchSource:0}: Error finding container e76ddfefaa121531254cb057bb63d0050a9d7c898e14be42796a884496b456c3: Status 404 returned error can't find the container with id e76ddfefaa121531254cb057bb63d0050a9d7c898e14be42796a884496b456c3 Apr 23 08:52:01.818615 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:01.818572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kl2k9" event={"ID":"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3","Type":"ContainerStarted","Data":"e76ddfefaa121531254cb057bb63d0050a9d7c898e14be42796a884496b456c3"} Apr 23 08:52:04.825818 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:04.825783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kl2k9" event={"ID":"81dd2f7c-f618-4c84-81fd-ff2be1c08dc3","Type":"ContainerStarted","Data":"fabe0727c0f830758dcb6fc2531f6773f5fd13e1dc70586482f570e9ae12c4eb"} Apr 23 08:52:04.826232 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:04.825945 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:52:04.842371 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:04.842322 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kl2k9" 
podStartSLOduration=67.408429888 podStartE2EDuration="1m9.842306805s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:52:01.487331819 +0000 UTC m=+66.462639425" lastFinishedPulling="2026-04-23 08:52:03.921208734 +0000 UTC m=+68.896516342" observedRunningTime="2026-04-23 08:52:04.841695054 +0000 UTC m=+69.817002679" watchObservedRunningTime="2026-04-23 08:52:04.842306805 +0000 UTC m=+69.817614430" Apr 23 08:52:31.784242 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:31.784209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:52:31.784632 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:31.784316 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 08:52:31.784632 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:31.784400 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert podName:fbb544b6-122a-4e2a-9835-e970e273e58b nodeName:}" failed. No retries permitted until 2026-04-23 08:53:35.784382252 +0000 UTC m=+160.759689859 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert") pod "ingress-canary-qfjzv" (UID: "fbb544b6-122a-4e2a-9835-e970e273e58b") : secret "canary-serving-cert" not found Apr 23 08:52:31.884713 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:31.884679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:52:31.884871 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:31.884799 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 08:52:31.884871 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:31.884849 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls podName:f3c57c70-2bd6-42fa-9ece-35b56e75a778 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:35.884836107 +0000 UTC m=+160.860143713 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls") pod "dns-default-qm6xv" (UID: "f3c57c70-2bd6-42fa-9ece-35b56e75a778") : secret "dns-default-metrics-tls" not found Apr 23 08:52:35.830627 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:35.830592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kl2k9" Apr 23 08:52:58.672332 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.672293 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6c9bfcb758-sm7fz"] Apr 23 08:52:58.673997 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.673979 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nsgdp"] Apr 23 08:52:58.674151 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.674134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 08:52:58.675403 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.675385 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-nsgdp" Apr 23 08:52:58.676951 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.676931 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 08:52:58.677137 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.677108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 08:52:58.677207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.677150 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 08:52:58.677269 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.677243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 08:52:58.678266 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678250 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 08:52:58.678477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678459 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 08:52:58.678566 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678492 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-7zglr\"" Apr 23 08:52:58.678566 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 08:52:58.678566 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678465 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 
08:52:58.678761 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678590 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-m9j6d\""
Apr 23 08:52:58.678814 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.678863 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.678813 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 23 08:52:58.684799 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.684779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 23 08:52:58.685107 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.685091 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nsgdp"]
Apr 23 08:52:58.688974 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.688954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6c9bfcb758-sm7fz"]
Apr 23 08:52:58.768118 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc13db8-46f8-47be-b720-51cd59fd933a-serving-cert\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.768118 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ddc13db8-46f8-47be-b720-51cd59fd933a-snapshots\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddc13db8-46f8-47be-b720-51cd59fd933a-service-ca-bundle\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-stats-auth\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm2wd\" (UniqueName: \"kubernetes.io/projected/2f8448c9-413a-4f5b-8f76-80f73e69d72f-kube-api-access-fm2wd\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddc13db8-46f8-47be-b720-51cd59fd933a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-default-certificate\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.768314 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.768515 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddc13db8-46f8-47be-b720-51cd59fd933a-tmp\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.768515 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.768414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbc2\" (UniqueName: \"kubernetes.io/projected/ddc13db8-46f8-47be-b720-51cd59fd933a-kube-api-access-7pbc2\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.769805 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.769780 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"]
Apr 23 08:52:58.771462 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.771449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"
Apr 23 08:52:58.774671 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.774651 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 08:52:58.775214 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.775185 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.775303 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.775225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mbg88\""
Apr 23 08:52:58.775380 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.775356 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"]
Apr 23 08:52:58.776859 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.776832 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dm56t"]
Apr 23 08:52:58.777030 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.777013 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:58.778396 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.778377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.779207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.779192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 08:52:58.779667 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.779649 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bhmbg\""
Apr 23 08:52:58.779756 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.779736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 08:52:58.782525 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.782507 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.782610 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.782580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 08:52:58.782668 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.782621 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 08:52:58.782668 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.782653 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:52:58.782815 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.782796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9m2vc\""
Apr 23 08:52:58.782890 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.782850 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 08:52:58.788885 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.788864 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"]
Apr 23 08:52:58.789793 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.789769 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 08:52:58.793075 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.793051 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"]
Apr 23 08:52:58.794124 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.794105 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dm56t"]
Apr 23 08:52:58.869538 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddc13db8-46f8-47be-b720-51cd59fd933a-tmp\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.869701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmt8t\" (UniqueName: \"kubernetes.io/projected/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-kube-api-access-kmt8t\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:58.869701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbc2\" (UniqueName: \"kubernetes.io/projected/ddc13db8-46f8-47be-b720-51cd59fd933a-kube-api-access-7pbc2\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.869800 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-config\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.869800 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-serving-cert\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.869800 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc13db8-46f8-47be-b720-51cd59fd933a-serving-cert\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.869966 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:58.869966 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ddc13db8-46f8-47be-b720-51cd59fd933a-snapshots\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.869966 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddc13db8-46f8-47be-b720-51cd59fd933a-tmp\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.869966 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddc13db8-46f8-47be-b720-51cd59fd933a-service-ca-bundle\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.869966 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-trusted-ca\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.870169 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.869991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d27t\" (UniqueName: \"kubernetes.io/projected/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-kube-api-access-2d27t\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.870169 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-stats-auth\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.870169 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fm2wd\" (UniqueName: \"kubernetes.io/projected/2f8448c9-413a-4f5b-8f76-80f73e69d72f-kube-api-access-fm2wd\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.870169 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddc13db8-46f8-47be-b720-51cd59fd933a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.870169 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt28m\" (UniqueName: \"kubernetes.io/projected/fd81dbd9-73c8-4e7d-86c3-e33a7bae662d-kube-api-access-xt28m\") pod \"volume-data-source-validator-7c6cbb6c87-rmh2t\" (UID: \"fd81dbd9-73c8-4e7d-86c3-e33a7bae662d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"
Apr 23 08:52:58.870169 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-default-certificate\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.870444 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.870444 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.870444 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:58.870350 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:52:58.870444 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:58.870408 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:52:59.370389064 +0000 UTC m=+124.345696744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : secret "router-metrics-certs-default" not found
Apr 23 08:52:58.870641 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870440 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddc13db8-46f8-47be-b720-51cd59fd933a-service-ca-bundle\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.870641 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ddc13db8-46f8-47be-b720-51cd59fd933a-snapshots\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.870641 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:58.870576 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:52:59.370558033 +0000 UTC m=+124.345865646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : configmap references non-existent config key: service-ca.crt
Apr 23 08:52:58.870944 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.870922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ddc13db8-46f8-47be-b720-51cd59fd933a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.872219 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.872200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddc13db8-46f8-47be-b720-51cd59fd933a-serving-cert\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.872339 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.872323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-stats-auth\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.872472 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.872454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-default-certificate\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.879938 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.879882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm2wd\" (UniqueName: \"kubernetes.io/projected/2f8448c9-413a-4f5b-8f76-80f73e69d72f-kube-api-access-fm2wd\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:58.880324 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.880305 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbc2\" (UniqueName: \"kubernetes.io/projected/ddc13db8-46f8-47be-b720-51cd59fd933a-kube-api-access-7pbc2\") pod \"insights-operator-585dfdc468-nsgdp\" (UID: \"ddc13db8-46f8-47be-b720-51cd59fd933a\") " pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:58.971540 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmt8t\" (UniqueName: \"kubernetes.io/projected/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-kube-api-access-kmt8t\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:58.971540 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-config\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-serving-cert\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-trusted-ca\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d27t\" (UniqueName: \"kubernetes.io/projected/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-kube-api-access-2d27t\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.971652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt28m\" (UniqueName: \"kubernetes.io/projected/fd81dbd9-73c8-4e7d-86c3-e33a7bae662d-kube-api-access-xt28m\") pod \"volume-data-source-validator-7c6cbb6c87-rmh2t\" (UID: \"fd81dbd9-73c8-4e7d-86c3-e33a7bae662d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:58.971662 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 08:52:58.971778 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:58.971756 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls podName:f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6 nodeName:}" failed. No retries permitted until 2026-04-23 08:52:59.471729675 +0000 UTC m=+124.447037360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w4vpz" (UID: "f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6") : secret "samples-operator-tls" not found
Apr 23 08:52:58.972338 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.972319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-config\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.972497 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.972479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-trusted-ca\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.973764 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.973747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-serving-cert\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.979918 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.979875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt28m\" (UniqueName: \"kubernetes.io/projected/fd81dbd9-73c8-4e7d-86c3-e33a7bae662d-kube-api-access-xt28m\") pod \"volume-data-source-validator-7c6cbb6c87-rmh2t\" (UID: \"fd81dbd9-73c8-4e7d-86c3-e33a7bae662d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"
Apr 23 08:52:58.979918 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.979885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmt8t\" (UniqueName: \"kubernetes.io/projected/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-kube-api-access-kmt8t\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:58.980171 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.980155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d27t\" (UniqueName: \"kubernetes.io/projected/b8ea5f2d-a09a-4865-8f65-103aa49ba68c-kube-api-access-2d27t\") pod \"console-operator-9d4b6777b-dm56t\" (UID: \"b8ea5f2d-a09a-4865-8f65-103aa49ba68c\") " pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:58.991098 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:58.991080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-nsgdp"
Apr 23 08:52:59.080640 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.080614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"
Apr 23 08:52:59.096347 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.096319 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nsgdp"]
Apr 23 08:52:59.096414 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.096377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:52:59.099322 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:52:59.099299 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddc13db8_46f8_47be_b720_51cd59fd933a.slice/crio-9b878068902fa81963e6c4f40445cbebf7bbe2b07ed3ccae3ccf078393c2b82e WatchSource:0}: Error finding container 9b878068902fa81963e6c4f40445cbebf7bbe2b07ed3ccae3ccf078393c2b82e: Status 404 returned error can't find the container with id 9b878068902fa81963e6c4f40445cbebf7bbe2b07ed3ccae3ccf078393c2b82e
Apr 23 08:52:59.197191 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.197163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t"]
Apr 23 08:52:59.200815 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:52:59.200781 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd81dbd9_73c8_4e7d_86c3_e33a7bae662d.slice/crio-005f7ca12b4f3bc2229ab2ff53af1597f94578a15af1b278ff2e1cf85888f2dc WatchSource:0}: Error finding container 005f7ca12b4f3bc2229ab2ff53af1597f94578a15af1b278ff2e1cf85888f2dc: Status 404 returned error can't find the container with id 005f7ca12b4f3bc2229ab2ff53af1597f94578a15af1b278ff2e1cf85888f2dc
Apr 23 08:52:59.211705 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.211684 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-dm56t"]
Apr 23 08:52:59.214331 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:52:59.214306 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ea5f2d_a09a_4865_8f65_103aa49ba68c.slice/crio-1503f6cfe49f751ccfbc4b2db4c919929067e42b2b3c0477e8a5d0b12e5c86aa WatchSource:0}: Error finding container 1503f6cfe49f751ccfbc4b2db4c919929067e42b2b3c0477e8a5d0b12e5c86aa: Status 404 returned error can't find the container with id 1503f6cfe49f751ccfbc4b2db4c919929067e42b2b3c0477e8a5d0b12e5c86aa
Apr 23 08:52:59.374485 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.374444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:59.374485 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.374488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:52:59.374717 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:59.374609 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:00.374590637 +0000 UTC m=+125.349898254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : configmap references non-existent config key: service-ca.crt
Apr 23 08:52:59.374717 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:59.374649 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:52:59.374717 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:59.374699 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:00.374687373 +0000 UTC m=+125.349994978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : secret "router-metrics-certs-default" not found
Apr 23 08:52:59.475156 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.475126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:52:59.475304 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:59.475271 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 08:52:59.475366 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:52:59.475335 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls podName:f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:00.475320729 +0000 UTC m=+125.450628334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w4vpz" (UID: "f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6") : secret "samples-operator-tls" not found
Apr 23 08:52:59.938469 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.938414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" event={"ID":"b8ea5f2d-a09a-4865-8f65-103aa49ba68c","Type":"ContainerStarted","Data":"1503f6cfe49f751ccfbc4b2db4c919929067e42b2b3c0477e8a5d0b12e5c86aa"}
Apr 23 08:52:59.939817 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.939789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nsgdp" event={"ID":"ddc13db8-46f8-47be-b720-51cd59fd933a","Type":"ContainerStarted","Data":"9b878068902fa81963e6c4f40445cbebf7bbe2b07ed3ccae3ccf078393c2b82e"}
Apr 23 08:52:59.941411 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:52:59.941382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t" event={"ID":"fd81dbd9-73c8-4e7d-86c3-e33a7bae662d","Type":"ContainerStarted","Data":"005f7ca12b4f3bc2229ab2ff53af1597f94578a15af1b278ff2e1cf85888f2dc"}
Apr 23 08:53:00.381329 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:00.381295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:00.381507 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:00.381342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:00.381507 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:00.381456 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:02.381436974 +0000 UTC m=+127.356744586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : configmap references non-existent config key: service-ca.crt
Apr 23 08:53:00.381633 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:00.381528 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 08:53:00.381633 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:00.381599 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:02.381581632 +0000 UTC m=+127.356889241 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : secret "router-metrics-certs-default" not found Apr 23 08:53:00.482401 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:00.482358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" Apr 23 08:53:00.482589 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:00.482493 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:53:00.482589 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:00.482574 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls podName:f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:02.482553936 +0000 UTC m=+127.457861558 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w4vpz" (UID: "f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6") : secret "samples-operator-tls" not found Apr 23 08:53:00.946120 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:00.946080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t" event={"ID":"fd81dbd9-73c8-4e7d-86c3-e33a7bae662d","Type":"ContainerStarted","Data":"fa77b22aa64abb9951da0dc6b2275f9395681a8d3d45690d7ac7581a4adf6715"} Apr 23 08:53:00.962350 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:00.962258 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmh2t" podStartSLOduration=1.453377222 podStartE2EDuration="2.962244963s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:52:59.202559947 +0000 UTC m=+124.177867552" lastFinishedPulling="2026-04-23 08:53:00.711427688 +0000 UTC m=+125.686735293" observedRunningTime="2026-04-23 08:53:00.961914116 +0000 UTC m=+125.937221736" watchObservedRunningTime="2026-04-23 08:53:00.962244963 +0000 UTC m=+125.937552627" Apr 23 08:53:02.399399 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.399305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 08:53:02.399399 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.399349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 08:53:02.399786 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:02.399483 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:53:02.399786 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:02.399504 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:06.399481904 +0000 UTC m=+131.374789525 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : configmap references non-existent config key: service-ca.crt Apr 23 08:53:02.399786 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:02.399534 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:06.3995245 +0000 UTC m=+131.374832108 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : secret "router-metrics-certs-default" not found Apr 23 08:53:02.500738 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.500704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" Apr 23 08:53:02.500883 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:02.500834 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:53:02.500949 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:02.500893 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls podName:f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:06.500876274 +0000 UTC m=+131.476183896 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w4vpz" (UID: "f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6") : secret "samples-operator-tls" not found Apr 23 08:53:02.951916 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.951879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/0.log" Apr 23 08:53:02.952094 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.951942 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" containerID="11570db31a9160d1d486ac982988fbd8a5892460ab2b45b6e4d56a6693e240dd" exitCode=255 Apr 23 08:53:02.952094 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.951974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" event={"ID":"b8ea5f2d-a09a-4865-8f65-103aa49ba68c","Type":"ContainerDied","Data":"11570db31a9160d1d486ac982988fbd8a5892460ab2b45b6e4d56a6693e240dd"} Apr 23 08:53:02.952264 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.952241 2575 scope.go:117] "RemoveContainer" containerID="11570db31a9160d1d486ac982988fbd8a5892460ab2b45b6e4d56a6693e240dd" Apr 23 08:53:02.953344 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.953327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nsgdp" event={"ID":"ddc13db8-46f8-47be-b720-51cd59fd933a","Type":"ContainerStarted","Data":"64f4f97656a42a263546baa1a5c7b7f1357bc7bff9382b8abe40d9b14f0cc77e"} Apr 23 08:53:02.985623 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:02.985581 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-nsgdp" podStartSLOduration=1.968147149 
podStartE2EDuration="4.985567981s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:52:59.101674039 +0000 UTC m=+124.076981644" lastFinishedPulling="2026-04-23 08:53:02.119084342 +0000 UTC m=+127.094402476" observedRunningTime="2026-04-23 08:53:02.984801831 +0000 UTC m=+127.960109470" watchObservedRunningTime="2026-04-23 08:53:02.985567981 +0000 UTC m=+127.960875608" Apr 23 08:53:03.956829 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:03.956800 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/1.log" Apr 23 08:53:03.957225 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:03.957194 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/0.log" Apr 23 08:53:03.957283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:03.957226 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" containerID="3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316" exitCode=255 Apr 23 08:53:03.957339 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:03.957318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" event={"ID":"b8ea5f2d-a09a-4865-8f65-103aa49ba68c","Type":"ContainerDied","Data":"3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316"} Apr 23 08:53:03.957384 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:03.957362 2575 scope.go:117] "RemoveContainer" containerID="11570db31a9160d1d486ac982988fbd8a5892460ab2b45b6e4d56a6693e240dd" Apr 23 08:53:03.957688 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:03.957669 2575 scope.go:117] "RemoveContainer" containerID="3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316" Apr 23 08:53:03.957865 
ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:03.957846 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dm56t_openshift-console-operator(b8ea5f2d-a09a-4865-8f65-103aa49ba68c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podUID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" Apr 23 08:53:04.335851 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.335814 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4"] Apr 23 08:53:04.338565 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.338534 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" Apr 23 08:53:04.341317 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.341290 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xxm7j\"" Apr 23 08:53:04.341423 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.341372 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 08:53:04.341657 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.341637 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 08:53:04.348197 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.348177 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4"] Apr 23 08:53:04.415673 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.415630 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7xf\" (UniqueName: \"kubernetes.io/projected/9bb76e15-3c64-4595-835e-3e58cb47ed46-kube-api-access-7p7xf\") pod \"migrator-74bb7799d9-smbx4\" (UID: \"9bb76e15-3c64-4595-835e-3e58cb47ed46\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" Apr 23 08:53:04.516984 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.516945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7xf\" (UniqueName: \"kubernetes.io/projected/9bb76e15-3c64-4595-835e-3e58cb47ed46-kube-api-access-7p7xf\") pod \"migrator-74bb7799d9-smbx4\" (UID: \"9bb76e15-3c64-4595-835e-3e58cb47ed46\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" Apr 23 08:53:04.524661 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.524632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7xf\" (UniqueName: \"kubernetes.io/projected/9bb76e15-3c64-4595-835e-3e58cb47ed46-kube-api-access-7p7xf\") pod \"migrator-74bb7799d9-smbx4\" (UID: \"9bb76e15-3c64-4595-835e-3e58cb47ed46\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" Apr 23 08:53:04.647879 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.647789 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" Apr 23 08:53:04.757991 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.757962 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4"] Apr 23 08:53:04.761176 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:04.761146 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb76e15_3c64_4595_835e_3e58cb47ed46.slice/crio-ab8af467364925172c588c975fdfb786e9bef1a5a8bdc623355c663a66700198 WatchSource:0}: Error finding container ab8af467364925172c588c975fdfb786e9bef1a5a8bdc623355c663a66700198: Status 404 returned error can't find the container with id ab8af467364925172c588c975fdfb786e9bef1a5a8bdc623355c663a66700198 Apr 23 08:53:04.922783 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.922701 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5lf2l_4e667463-9112-48df-b2c9-8ff9e9415bce/dns-node-resolver/0.log" Apr 23 08:53:04.960358 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.960320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" event={"ID":"9bb76e15-3c64-4595-835e-3e58cb47ed46","Type":"ContainerStarted","Data":"ab8af467364925172c588c975fdfb786e9bef1a5a8bdc623355c663a66700198"} Apr 23 08:53:04.961611 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.961593 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/1.log" Apr 23 08:53:04.961986 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:04.961964 2575 scope.go:117] "RemoveContainer" containerID="3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316" Apr 23 08:53:04.962152 ip-10-0-141-250 kubenswrapper[2575]: E0423 
08:53:04.962133 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dm56t_openshift-console-operator(b8ea5f2d-a09a-4865-8f65-103aa49ba68c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podUID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" Apr 23 08:53:05.322951 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:05.322887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv" Apr 23 08:53:05.323119 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:05.323047 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 08:53:05.323158 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:05.323133 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs podName:c32e908b-8a1f-4d28-99e1-dce39209186a nodeName:}" failed. No retries permitted until 2026-04-23 08:55:07.323117854 +0000 UTC m=+252.298425459 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs") pod "network-metrics-daemon-9tmnv" (UID: "c32e908b-8a1f-4d28-99e1-dce39209186a") : secret "metrics-daemon-secret" not found Apr 23 08:53:05.923132 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:05.923104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-56m8r_c3c1faf4-8a9e-479e-ac99-dbded210df17/node-ca/0.log" Apr 23 08:53:06.431568 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:06.431539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 08:53:06.431568 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:06.431573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 08:53:06.432016 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:06.431704 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:14.431683854 +0000 UTC m=+139.406991464 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : configmap references non-existent config key: service-ca.crt Apr 23 08:53:06.432016 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:06.431756 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 08:53:06.432016 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:06.431812 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs podName:2f8448c9-413a-4f5b-8f76-80f73e69d72f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:14.431797196 +0000 UTC m=+139.407104801 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs") pod "router-default-6c9bfcb758-sm7fz" (UID: "2f8448c9-413a-4f5b-8f76-80f73e69d72f") : secret "router-metrics-certs-default" not found Apr 23 08:53:06.532454 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:06.532380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" Apr 23 08:53:06.532586 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:06.532491 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 08:53:06.532586 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:06.532538 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls podName:f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6 nodeName:}" failed. No retries permitted until 2026-04-23 08:53:14.532525485 +0000 UTC m=+139.507833090 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w4vpz" (UID: "f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6") : secret "samples-operator-tls" not found Apr 23 08:53:06.967809 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:06.967730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" event={"ID":"9bb76e15-3c64-4595-835e-3e58cb47ed46","Type":"ContainerStarted","Data":"b0b1952a7de1027bf8b2d9fdabf77becd71dfaef8aa26b275b1e4e6360a6fde8"} Apr 23 08:53:06.967809 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:06.967768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" event={"ID":"9bb76e15-3c64-4595-835e-3e58cb47ed46","Type":"ContainerStarted","Data":"b8f41aa2af2b05ee1416529211b5c071459d4c8b5b700538ebb2d1410ea0ae8d"} Apr 23 08:53:06.984607 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:06.984053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smbx4" podStartSLOduration=1.4807115 podStartE2EDuration="2.984037015s" podCreationTimestamp="2026-04-23 08:53:04 +0000 UTC" firstStartedPulling="2026-04-23 08:53:04.763004035 +0000 UTC m=+129.738311640" lastFinishedPulling="2026-04-23 08:53:06.266329547 +0000 UTC m=+131.241637155" observedRunningTime="2026-04-23 08:53:06.983507747 +0000 UTC m=+131.958815374" watchObservedRunningTime="2026-04-23 08:53:06.984037015 +0000 UTC m=+131.959344665" 
Apr 23 08:53:09.097252 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:09.097204 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" Apr 23 08:53:09.097252 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:09.097258 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" Apr 23 08:53:09.097785 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:09.097736 2575 scope.go:117] "RemoveContainer" containerID="3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316" Apr 23 08:53:09.097967 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:09.097945 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-dm56t_openshift-console-operator(b8ea5f2d-a09a-4865-8f65-103aa49ba68c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podUID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" Apr 23 08:53:14.501279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.501216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 08:53:14.501279 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.501278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" Apr 23 
08:53:14.501851 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.501825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f8448c9-413a-4f5b-8f76-80f73e69d72f-service-ca-bundle\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:14.503553 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.503529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f8448c9-413a-4f5b-8f76-80f73e69d72f-metrics-certs\") pod \"router-default-6c9bfcb758-sm7fz\" (UID: \"2f8448c9-413a-4f5b-8f76-80f73e69d72f\") " pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:14.585433 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.585394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:14.602213 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.602188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:53:14.604686 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.604652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w4vpz\" (UID: \"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:53:14.688176 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.688129 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"
Apr 23 08:53:14.705622 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.705589 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6c9bfcb758-sm7fz"]
Apr 23 08:53:14.708450 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:14.708424 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f8448c9_413a_4f5b_8f76_80f73e69d72f.slice/crio-1760e62f073692b190fc43e30c25eba4c2bb73248d39cabdaa0f8aa1a7d3092d WatchSource:0}: Error finding container 1760e62f073692b190fc43e30c25eba4c2bb73248d39cabdaa0f8aa1a7d3092d: Status 404 returned error can't find the container with id 1760e62f073692b190fc43e30c25eba4c2bb73248d39cabdaa0f8aa1a7d3092d
Apr 23 08:53:14.803320 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.803285 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz"]
Apr 23 08:53:14.988646 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.988612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" event={"ID":"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6","Type":"ContainerStarted","Data":"85e58d41dfaec6274cfd6ae200db172d3ae40c4b871b93667de3c8fc586a1701"}
Apr 23 08:53:14.989738 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.989715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" event={"ID":"2f8448c9-413a-4f5b-8f76-80f73e69d72f","Type":"ContainerStarted","Data":"61d209d9ed2e9cc533ac090a7cb49eb8b62bf7ea594fb8db265592f876dfa5c7"}
Apr 23 08:53:14.989913 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:14.989745 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" event={"ID":"2f8448c9-413a-4f5b-8f76-80f73e69d72f","Type":"ContainerStarted","Data":"1760e62f073692b190fc43e30c25eba4c2bb73248d39cabdaa0f8aa1a7d3092d"}
Apr 23 08:53:15.010742 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:15.010649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz" podStartSLOduration=17.010633666 podStartE2EDuration="17.010633666s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:53:15.01037896 +0000 UTC m=+139.985686581" watchObservedRunningTime="2026-04-23 08:53:15.010633666 +0000 UTC m=+139.985941292"
Apr 23 08:53:15.585753 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:15.585718 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:15.589727 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:15.589705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:15.992882 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:15.992844 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:15.994431 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:15.994410 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6c9bfcb758-sm7fz"
Apr 23 08:53:16.996156 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:16.996127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" event={"ID":"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6","Type":"ContainerStarted","Data":"1e1bfe5112ade5c3ead68bb8634e3b12f9c2e88367baa7ce46b0d4b550a8879d"}
Apr 23 08:53:16.996528 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:16.996169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" event={"ID":"f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6","Type":"ContainerStarted","Data":"f1919c2c9491b821951fe7fc314e1ba9695f8355569814351cf4393c29ecb0c0"}
Apr 23 08:53:17.012985 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:17.012940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w4vpz" podStartSLOduration=17.209180749 podStartE2EDuration="19.012925934s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:53:14.837658147 +0000 UTC m=+139.812965752" lastFinishedPulling="2026-04-23 08:53:16.641403332 +0000 UTC m=+141.616710937" observedRunningTime="2026-04-23 08:53:17.012158487 +0000 UTC m=+141.987466113" watchObservedRunningTime="2026-04-23 08:53:17.012925934 +0000 UTC m=+141.988233557"
Apr 23 08:53:21.540621 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:21.540592 2575 scope.go:117] "RemoveContainer" containerID="3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316"
Apr 23 08:53:22.009471 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:22.009442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 08:53:22.009789 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:22.009770 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/1.log"
Apr 23 08:53:22.009880 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:22.009813 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" containerID="fe3a9226043c3fb7f8157c993a186fbde7d5053ab3b2bedc7c15be382d20b22a" exitCode=255
Apr 23 08:53:22.009880 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:22.009857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" event={"ID":"b8ea5f2d-a09a-4865-8f65-103aa49ba68c","Type":"ContainerDied","Data":"fe3a9226043c3fb7f8157c993a186fbde7d5053ab3b2bedc7c15be382d20b22a"}
Apr 23 08:53:22.010009 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:22.009919 2575 scope.go:117] "RemoveContainer" containerID="3f4a96cac00a733358be9ad0ab30487d93d5a247865d9f529bfdd2f305c6f316"
Apr 23 08:53:22.010232 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:22.010216 2575 scope.go:117] "RemoveContainer" containerID="fe3a9226043c3fb7f8157c993a186fbde7d5053ab3b2bedc7c15be382d20b22a"
Apr 23 08:53:22.010408 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:22.010374 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dm56t_openshift-console-operator(b8ea5f2d-a09a-4865-8f65-103aa49ba68c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podUID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c"
Apr 23 08:53:23.015510 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:23.015484 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 08:53:25.440166 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.440127 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"]
Apr 23 08:53:25.442166 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.442145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.444687 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.444669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-s6n42\""
Apr 23 08:53:25.444861 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.444842 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 23 08:53:25.445772 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.445756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 23 08:53:25.452330 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.452310 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"]
Apr 23 08:53:25.531488 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.531457 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7d564d9886-5ch26"]
Apr 23 08:53:25.533685 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.533661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.536677 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.536331 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 08:53:25.536820 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.536755 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 08:53:25.537130 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.537108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48746\""
Apr 23 08:53:25.537822 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.537473 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 08:53:25.541403 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.541381 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 08:53:25.547623 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.547596 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d564d9886-5ch26"]
Apr 23 08:53:25.548871 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.548845 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bcg62"]
Apr 23 08:53:25.550785 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.550767 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.553765 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.553738 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 08:53:25.553970 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.553884 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vx78w\""
Apr 23 08:53:25.554312 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.554295 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 08:53:25.563610 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.563588 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bcg62"]
Apr 23 08:53:25.585323 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.585300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8133d4a4-92a8-44ef-a085-59ed02873e69-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-25tqt\" (UID: \"8133d4a4-92a8-44ef-a085-59ed02873e69\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.585428 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.585332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8133d4a4-92a8-44ef-a085-59ed02873e69-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-25tqt\" (UID: \"8133d4a4-92a8-44ef-a085-59ed02873e69\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.685866 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.685832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zrdk\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-kube-api-access-2zrdk\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.685866 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.685868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9c40a04-6569-440b-a7a3-24f158bed60b-ca-trust-extracted\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686122 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.685894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9c40a04-6569-440b-a7a3-24f158bed60b-image-registry-private-configuration\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686122 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9c40a04-6569-440b-a7a3-24f158bed60b-registry-certificates\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686122 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9c40a04-6569-440b-a7a3-24f158bed60b-trusted-ca\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686265 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8133d4a4-92a8-44ef-a085-59ed02873e69-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-25tqt\" (UID: \"8133d4a4-92a8-44ef-a085-59ed02873e69\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.686265 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-registry-tls\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686265 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9c40a04-6569-440b-a7a3-24f158bed60b-installation-pull-secrets\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8133d4a4-92a8-44ef-a085-59ed02873e69-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-25tqt\" (UID: \"8133d4a4-92a8-44ef-a085-59ed02873e69\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.686402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmfxn\" (UniqueName: \"kubernetes.io/projected/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-kube-api-access-rmfxn\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.686402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-crio-socket\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.686402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.686402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.686643 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-data-volume\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.686643 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686481 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-bound-sa-token\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.686768 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.686754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8133d4a4-92a8-44ef-a085-59ed02873e69-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-25tqt\" (UID: \"8133d4a4-92a8-44ef-a085-59ed02873e69\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.688628 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.688608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8133d4a4-92a8-44ef-a085-59ed02873e69-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-25tqt\" (UID: \"8133d4a4-92a8-44ef-a085-59ed02873e69\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.751148 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.751123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
Apr 23 08:53:25.787172 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787146 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9c40a04-6569-440b-a7a3-24f158bed60b-trusted-ca\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787342 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-registry-tls\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787342 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9c40a04-6569-440b-a7a3-24f158bed60b-installation-pull-secrets\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787454 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmfxn\" (UniqueName: \"kubernetes.io/projected/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-kube-api-access-rmfxn\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.787511 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-crio-socket\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.787564 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.787564 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.787664 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-data-volume\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.787664 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-crio-socket\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.787664 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-bound-sa-token\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787805 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zrdk\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-kube-api-access-2zrdk\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787805 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9c40a04-6569-440b-a7a3-24f158bed60b-ca-trust-extracted\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787805 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9c40a04-6569-440b-a7a3-24f158bed60b-image-registry-private-configuration\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.787805 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.787769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9c40a04-6569-440b-a7a3-24f158bed60b-registry-certificates\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.788215 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.788190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9c40a04-6569-440b-a7a3-24f158bed60b-ca-trust-extracted\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.788378 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.788348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-data-volume\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.788460 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.788373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9c40a04-6569-440b-a7a3-24f158bed60b-trusted-ca\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.788582 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.788558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9c40a04-6569-440b-a7a3-24f158bed60b-registry-certificates\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.788733 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.788709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.789867 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.789845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9c40a04-6569-440b-a7a3-24f158bed60b-installation-pull-secrets\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.790270 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.790250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9c40a04-6569-440b-a7a3-24f158bed60b-image-registry-private-configuration\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.790713 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.790678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.790797 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.790730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-registry-tls\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.797361 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.797321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-bound-sa-token\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.798038 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.797987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zrdk\" (UniqueName: \"kubernetes.io/projected/d9c40a04-6569-440b-a7a3-24f158bed60b-kube-api-access-2zrdk\") pod \"image-registry-7d564d9886-5ch26\" (UID: \"d9c40a04-6569-440b-a7a3-24f158bed60b\") " pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.799588 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.799533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmfxn\" (UniqueName: \"kubernetes.io/projected/b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c-kube-api-access-rmfxn\") pod \"insights-runtime-extractor-bcg62\" (UID: \"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c\") " pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.845322 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.845295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:25.862261 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.862230 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bcg62"
Apr 23 08:53:25.866289 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.866257 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"]
Apr 23 08:53:25.869186 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:25.869154 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8133d4a4_92a8_44ef_a085_59ed02873e69.slice/crio-d58ce21dc3d17760b6687e90cae6074ce485ee04ea066629cd1bc80fd1f8501b WatchSource:0}: Error finding container d58ce21dc3d17760b6687e90cae6074ce485ee04ea066629cd1bc80fd1f8501b: Status 404 returned error can't find the container with id d58ce21dc3d17760b6687e90cae6074ce485ee04ea066629cd1bc80fd1f8501b
Apr 23 08:53:25.982745 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:25.982708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7d564d9886-5ch26"]
Apr 23 08:53:25.985414 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:25.985384 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c40a04_6569_440b_a7a3_24f158bed60b.slice/crio-7996b011a2fdf2de1e0f6e3a57416c00fcc9831bcb2c05859cdaa11bcd0898f7 WatchSource:0}: Error finding container 7996b011a2fdf2de1e0f6e3a57416c00fcc9831bcb2c05859cdaa11bcd0898f7: Status 404 returned error can't find the container with id 7996b011a2fdf2de1e0f6e3a57416c00fcc9831bcb2c05859cdaa11bcd0898f7
Apr 23 08:53:26.000929 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:26.000892 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bcg62"]
Apr 23 08:53:26.003974 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:26.003948 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b74bfc_0c88_4e90_864f_9d7bdf5ed72c.slice/crio-ce86b9278a80bde53f9edf951321ff816db5eeb922702d7211fa8057a5ca6abf WatchSource:0}: Error finding container ce86b9278a80bde53f9edf951321ff816db5eeb922702d7211fa8057a5ca6abf: Status 404 returned error can't find the container with id ce86b9278a80bde53f9edf951321ff816db5eeb922702d7211fa8057a5ca6abf
Apr 23 08:53:26.023323 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:26.022604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcg62" event={"ID":"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c","Type":"ContainerStarted","Data":"ce86b9278a80bde53f9edf951321ff816db5eeb922702d7211fa8057a5ca6abf"}
Apr 23 08:53:26.024114 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:26.024091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d564d9886-5ch26" event={"ID":"d9c40a04-6569-440b-a7a3-24f158bed60b","Type":"ContainerStarted","Data":"7996b011a2fdf2de1e0f6e3a57416c00fcc9831bcb2c05859cdaa11bcd0898f7"}
Apr 23 08:53:26.025231 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:26.025203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt" event={"ID":"8133d4a4-92a8-44ef-a085-59ed02873e69","Type":"ContainerStarted","Data":"d58ce21dc3d17760b6687e90cae6074ce485ee04ea066629cd1bc80fd1f8501b"}
Apr 23 08:53:27.028835 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:27.028801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcg62" event={"ID":"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c","Type":"ContainerStarted","Data":"c4d276441d256f3a9eba76a66874a1c1460c1486df3da8e002ae29e87491aef1"}
Apr 23 08:53:27.029944 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:27.029920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7d564d9886-5ch26" event={"ID":"d9c40a04-6569-440b-a7a3-24f158bed60b","Type":"ContainerStarted","Data":"7b35acfe9bafb97a78ce3a38e237a59e42078582fbedb4232222fe8583663691"}
Apr 23 08:53:27.030078 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:27.030063 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:27.049833 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:27.049792 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7d564d9886-5ch26" podStartSLOduration=2.049777304 podStartE2EDuration="2.049777304s" podCreationTimestamp="2026-04-23 08:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:53:27.049008786 +0000 UTC m=+152.024316413" watchObservedRunningTime="2026-04-23 08:53:27.049777304 +0000 UTC m=+152.025084930"
Apr 23 08:53:28.038160 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:28.038072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcg62" event={"ID":"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c","Type":"ContainerStarted","Data":"9fb64eba105a984b1bf72f0e5713a28fa2a0803c733517ec208fdd9ba8131c82"}
Apr 23 08:53:28.039290 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:28.039266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt" event={"ID":"8133d4a4-92a8-44ef-a085-59ed02873e69","Type":"ContainerStarted","Data":"aa407f592869cb9422df8f62754298819378d06669f9ab8f69c9111f76e277c2"}
Apr 23 08:53:28.058381 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:28.058337 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-25tqt"
podStartSLOduration=1.158469519 podStartE2EDuration="3.058325943s" podCreationTimestamp="2026-04-23 08:53:25 +0000 UTC" firstStartedPulling="2026-04-23 08:53:25.871402726 +0000 UTC m=+150.846710339" lastFinishedPulling="2026-04-23 08:53:27.771259149 +0000 UTC m=+152.746566763" observedRunningTime="2026-04-23 08:53:28.058175926 +0000 UTC m=+153.033483553" watchObservedRunningTime="2026-04-23 08:53:28.058325943 +0000 UTC m=+153.033633575" Apr 23 08:53:29.097079 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:29.097048 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" Apr 23 08:53:29.097079 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:29.097078 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" Apr 23 08:53:29.097393 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:29.097380 2575 scope.go:117] "RemoveContainer" containerID="fe3a9226043c3fb7f8157c993a186fbde7d5053ab3b2bedc7c15be382d20b22a" Apr 23 08:53:29.097551 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:29.097537 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dm56t_openshift-console-operator(b8ea5f2d-a09a-4865-8f65-103aa49ba68c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podUID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c" Apr 23 08:53:30.046513 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:30.046480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcg62" event={"ID":"b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c","Type":"ContainerStarted","Data":"73ba1df8de7e2e3af7d7d60158c18e2e9f6af6eda844e139a2e80327f4ce2689"} Apr 23 08:53:30.064576 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:53:30.064534 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bcg62" podStartSLOduration=2.105424026 podStartE2EDuration="5.064519089s" podCreationTimestamp="2026-04-23 08:53:25 +0000 UTC" firstStartedPulling="2026-04-23 08:53:26.042841298 +0000 UTC m=+151.018148903" lastFinishedPulling="2026-04-23 08:53:29.001936361 +0000 UTC m=+153.977243966" observedRunningTime="2026-04-23 08:53:30.063625045 +0000 UTC m=+155.038932672" watchObservedRunningTime="2026-04-23 08:53:30.064519089 +0000 UTC m=+155.039826758" Apr 23 08:53:31.005891 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:31.005854 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qfjzv" podUID="fbb544b6-122a-4e2a-9835-e970e273e58b" Apr 23 08:53:31.017018 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:31.016988 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qm6xv" podUID="f3c57c70-2bd6-42fa-9ece-35b56e75a778" Apr 23 08:53:31.048488 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:31.048459 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:53:32.555800 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:32.555763 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9tmnv" podUID="c32e908b-8a1f-4d28-99e1-dce39209186a" Apr 23 08:53:32.869462 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.869382 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-x9b4d"] Apr 23 08:53:32.877079 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.877055 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.879161 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.879118 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 08:53:32.879349 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.879305 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 08:53:32.879524 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.879503 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 08:53:32.879628 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.879569 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 08:53:32.879628 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.879611 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x268v\"" Apr 23 08:53:32.880560 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.880540 2575 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 08:53:32.880689 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.880670 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 08:53:32.937856 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.937827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzdtl\" (UniqueName: \"kubernetes.io/projected/1d8f119e-e62b-482e-b8b2-d61c14023d7f-kube-api-access-mzdtl\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.937988 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.937885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-sys\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938028 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.937991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8f119e-e62b-482e-b8b2-d61c14023d7f-metrics-client-ca\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938061 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.938025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-accelerators-collector-config\") pod \"node-exporter-x9b4d\" 
(UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938061 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.938051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-wtmp\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938119 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.938074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-textfile\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938147 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.938138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-tls\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938182 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.938158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-root\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:32.938182 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:32.938173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-wtmp\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039233 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-textfile\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039233 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-tls\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039233 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-root\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039233 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039218 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzdtl\" (UniqueName: \"kubernetes.io/projected/1d8f119e-e62b-482e-b8b2-d61c14023d7f-kube-api-access-mzdtl\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-wtmp\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-root\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-sys\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:53:33.039354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8f119e-e62b-482e-b8b2-d61c14023d7f-metrics-client-ca\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:33.039379 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 08:53:33.039402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-accelerators-collector-config\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039702 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d8f119e-e62b-482e-b8b2-d61c14023d7f-sys\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039702 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:33.039456 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-tls podName:1d8f119e-e62b-482e-b8b2-d61c14023d7f nodeName:}" failed. No retries permitted until 2026-04-23 08:53:33.53943358 +0000 UTC m=+158.514741187 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-tls") pod "node-exporter-x9b4d" (UID: "1d8f119e-e62b-482e-b8b2-d61c14023d7f") : secret "node-exporter-tls" not found Apr 23 08:53:33.039702 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-textfile\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.039993 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d8f119e-e62b-482e-b8b2-d61c14023d7f-metrics-client-ca\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.040038 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.039975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-accelerators-collector-config\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.041563 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.041538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.047678 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:53:33.047653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzdtl\" (UniqueName: \"kubernetes.io/projected/1d8f119e-e62b-482e-b8b2-d61c14023d7f-kube-api-access-mzdtl\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.543124 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.543087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-tls\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.545378 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.545352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d8f119e-e62b-482e-b8b2-d61c14023d7f-node-exporter-tls\") pod \"node-exporter-x9b4d\" (UID: \"1d8f119e-e62b-482e-b8b2-d61c14023d7f\") " pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.786242 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.786204 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-x9b4d" Apr 23 08:53:33.794656 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:33.794592 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8f119e_e62b_482e_b8b2_d61c14023d7f.slice/crio-36ca7fd0a5ed9fa8ac6cc5f616394d6fbd084add04aecfaef800d33a0f668069 WatchSource:0}: Error finding container 36ca7fd0a5ed9fa8ac6cc5f616394d6fbd084add04aecfaef800d33a0f668069: Status 404 returned error can't find the container with id 36ca7fd0a5ed9fa8ac6cc5f616394d6fbd084add04aecfaef800d33a0f668069 Apr 23 08:53:33.959256 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.959222 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:53:33.962113 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.962098 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:33.964575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 08:53:33.964575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964568 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 08:53:33.964739 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 08:53:33.964859 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964836 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 08:53:33.964992 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964951 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 08:53:33.964992 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964987 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dswlz\"" Apr 23 08:53:33.965102 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.964987 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 08:53:33.965102 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.965022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 08:53:33.965102 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.965045 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 08:53:33.965349 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.965333 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 08:53:33.976135 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:33.976116 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:53:34.046857 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.046797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.046857 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.046841 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-out\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047010 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.046858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047010 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.046874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047010 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.046976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047010 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.046996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047133 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047133 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-web-config\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047133 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047240 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047240 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047158 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047240 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.047240 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.047198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7frv\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-kube-api-access-p7frv\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.057643 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.057617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x9b4d" event={"ID":"1d8f119e-e62b-482e-b8b2-d61c14023d7f","Type":"ContainerStarted","Data":"36ca7fd0a5ed9fa8ac6cc5f616394d6fbd084add04aecfaef800d33a0f668069"} Apr 23 08:53:34.147505 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.147505 ip-10-0-141-250 
kubenswrapper[2575]: I0423 08:53:34.147511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-web-config\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.147712 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.147712 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.147712 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.147712 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 23 08:53:34.147712 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7frv\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-kube-api-access-p7frv\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.147712 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.147680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148209 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:34.147779 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 08:53:34.148209 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:34.147832 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls podName:1a4d8dea-d968-447a-ac8c-b695ed740c1a nodeName:}" failed. No retries permitted until 2026-04-23 08:53:34.647815524 +0000 UTC m=+159.623123132 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a") : secret "alertmanager-main-tls" not found Apr 23 08:53:34.148209 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.148176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-out\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148209 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.148207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148422 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.148232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148422 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.148311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148422 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:53:34.148337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148422 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.148399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.148655 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:34.148433 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle podName:1a4d8dea-d968-447a-ac8c-b695ed740c1a nodeName:}" failed. No retries permitted until 2026-04-23 08:53:34.648414944 +0000 UTC m=+159.623722551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a") : configmap references non-existent config key: ca-bundle.crt Apr 23 08:53:34.148805 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.148771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.150599 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.150577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.150840 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.150803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-out\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.151098 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.151074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.151171 ip-10-0-141-250 kubenswrapper[2575]: 
I0423 08:53:34.151082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.151211 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.151179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.151211 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.151197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-web-config\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.151298 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.151276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-volume\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.152046 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.152030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 
08:53:34.156811 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.156794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7frv\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-kube-api-access-p7frv\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.652133 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.652102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.652249 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.652166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.652859 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.652832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.654221 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.654202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls\") pod 
\"alertmanager-main-0\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.871301 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.871267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:53:34.999258 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:34.999224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:53:35.002388 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:35.002364 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a4d8dea_d968_447a_ac8c_b695ed740c1a.slice/crio-fccc3093410bebe059a0d3f2b42a84515585767fdc9691cac496228a197f4358 WatchSource:0}: Error finding container fccc3093410bebe059a0d3f2b42a84515585767fdc9691cac496228a197f4358: Status 404 returned error can't find the container with id fccc3093410bebe059a0d3f2b42a84515585767fdc9691cac496228a197f4358 Apr 23 08:53:35.061584 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.061547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"fccc3093410bebe059a0d3f2b42a84515585767fdc9691cac496228a197f4358"} Apr 23 08:53:35.062821 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.062795 2575 generic.go:358] "Generic (PLEG): container finished" podID="1d8f119e-e62b-482e-b8b2-d61c14023d7f" containerID="6c23f0dd191006a6cab6d7a4e8d26ed249785c18b0d2b2de1c37b24d33d9512a" exitCode=0 Apr 23 08:53:35.062943 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.062831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x9b4d" 
event={"ID":"1d8f119e-e62b-482e-b8b2-d61c14023d7f","Type":"ContainerDied","Data":"6c23f0dd191006a6cab6d7a4e8d26ed249785c18b0d2b2de1c37b24d33d9512a"} Apr 23 08:53:35.862792 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.862762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:53:35.865180 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.865151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fbb544b6-122a-4e2a-9835-e970e273e58b-cert\") pod \"ingress-canary-qfjzv\" (UID: \"fbb544b6-122a-4e2a-9835-e970e273e58b\") " pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:53:35.963804 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.963769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:53:35.966394 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:35.966371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3c57c70-2bd6-42fa-9ece-35b56e75a778-metrics-tls\") pod \"dns-default-qm6xv\" (UID: \"f3c57c70-2bd6-42fa-9ece-35b56e75a778\") " pod="openshift-dns/dns-default-qm6xv" Apr 23 08:53:36.067867 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:36.067831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x9b4d" event={"ID":"1d8f119e-e62b-482e-b8b2-d61c14023d7f","Type":"ContainerStarted","Data":"618c3ea77041ff661262dab85476d9c81441c6173d7a03fce29a5a36523d7dd3"} 
Apr 23 08:53:36.067867 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:36.067868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x9b4d" event={"ID":"1d8f119e-e62b-482e-b8b2-d61c14023d7f","Type":"ContainerStarted","Data":"9ae6a996ff3b99f5698087d4b1b434bbd2953e1291df97fbadbf84250ee5364d"} Apr 23 08:53:36.087042 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:36.086990 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-x9b4d" podStartSLOduration=3.408046809 podStartE2EDuration="4.086975913s" podCreationTimestamp="2026-04-23 08:53:32 +0000 UTC" firstStartedPulling="2026-04-23 08:53:33.796414998 +0000 UTC m=+158.771722609" lastFinishedPulling="2026-04-23 08:53:34.475344108 +0000 UTC m=+159.450651713" observedRunningTime="2026-04-23 08:53:36.085646589 +0000 UTC m=+161.060954217" watchObservedRunningTime="2026-04-23 08:53:36.086975913 +0000 UTC m=+161.062283540" Apr 23 08:53:36.151851 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:36.151780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nrkt9\"" Apr 23 08:53:36.159753 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:36.159730 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qfjzv" Apr 23 08:53:36.269263 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:36.269235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qfjzv"] Apr 23 08:53:36.272066 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:36.272043 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb544b6_122a_4e2a_9835_e970e273e58b.slice/crio-8bc88e13484e90a39a4e5053a1748faa5475a0ee7e39bebae094ee8a9327d699 WatchSource:0}: Error finding container 8bc88e13484e90a39a4e5053a1748faa5475a0ee7e39bebae094ee8a9327d699: Status 404 returned error can't find the container with id 8bc88e13484e90a39a4e5053a1748faa5475a0ee7e39bebae094ee8a9327d699 Apr 23 08:53:37.071943 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.071890 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qfjzv" event={"ID":"fbb544b6-122a-4e2a-9835-e970e273e58b","Type":"ContainerStarted","Data":"8bc88e13484e90a39a4e5053a1748faa5475a0ee7e39bebae094ee8a9327d699"} Apr 23 08:53:37.073393 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.073362 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="3e5423673e4c0a8ee97ce1f3772a76bc5203f01c4daea47584672a679a49ae0e" exitCode=0 Apr 23 08:53:37.073515 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.073427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"3e5423673e4c0a8ee97ce1f3772a76bc5203f01c4daea47584672a679a49ae0e"} Apr 23 08:53:37.262421 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.262386 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6597794cf6-jxhvl"] Apr 23 08:53:37.266249 
ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.266226 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.268846 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.268823 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 08:53:37.268971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.268823 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 08:53:37.268971 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.268961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-eern6jmmk7gl3\"" Apr 23 08:53:37.269095 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.269003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 08:53:37.269095 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.269034 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 08:53:37.269095 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.269003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jpbx8\"" Apr 23 08:53:37.277565 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.277525 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6597794cf6-jxhvl"] Apr 23 08:53:37.376022 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.375926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/222c8c75-9350-46dc-9088-28d00d4e6b2a-metrics-server-audit-profiles\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.376022 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.375976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-client-ca-bundle\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.376207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.376082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-secret-metrics-server-client-certs\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.376207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.376139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/222c8c75-9350-46dc-9088-28d00d4e6b2a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.376207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.376162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/222c8c75-9350-46dc-9088-28d00d4e6b2a-audit-log\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: 
\"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.376207 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.376185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vxd\" (UniqueName: \"kubernetes.io/projected/222c8c75-9350-46dc-9088-28d00d4e6b2a-kube-api-access-z9vxd\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.376356 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.376292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-secret-metrics-server-tls\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.477364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-secret-metrics-server-tls\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477588 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.477423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/222c8c75-9350-46dc-9088-28d00d4e6b2a-metrics-server-audit-profiles\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477588 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:53:37.477452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-client-ca-bundle\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477588 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.477515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-secret-metrics-server-client-certs\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477588 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.477564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/222c8c75-9350-46dc-9088-28d00d4e6b2a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477794 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.477589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/222c8c75-9350-46dc-9088-28d00d4e6b2a-audit-log\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.477794 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.477621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vxd\" (UniqueName: \"kubernetes.io/projected/222c8c75-9350-46dc-9088-28d00d4e6b2a-kube-api-access-z9vxd\") pod 
\"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.478983 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.478953 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/222c8c75-9350-46dc-9088-28d00d4e6b2a-audit-log\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.479205 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.479184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/222c8c75-9350-46dc-9088-28d00d4e6b2a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.479652 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.479625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/222c8c75-9350-46dc-9088-28d00d4e6b2a-metrics-server-audit-profiles\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.480454 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.480431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-client-ca-bundle\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.480551 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.480440 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-secret-metrics-server-tls\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.480602 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.480555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/222c8c75-9350-46dc-9088-28d00d4e6b2a-secret-metrics-server-client-certs\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.485349 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.485324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vxd\" (UniqueName: \"kubernetes.io/projected/222c8c75-9350-46dc-9088-28d00d4e6b2a-kube-api-access-z9vxd\") pod \"metrics-server-6597794cf6-jxhvl\" (UID: \"222c8c75-9350-46dc-9088-28d00d4e6b2a\") " pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:37.580216 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:37.580131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" Apr 23 08:53:38.044224 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.044198 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"] Apr 23 08:53:38.047354 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.047332 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.050577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.050319 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 23 08:53:38.050577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.050390 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 23 08:53:38.050577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.050425 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 23 08:53:38.050577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.050451 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 23 08:53:38.050577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.050505 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 23 08:53:38.050577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.050576 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-ld7ng\""
Apr 23 08:53:38.056553 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.056531 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 23 08:53:38.060540 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.060479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"]
Apr 23 08:53:38.085446 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqsj\" (UniqueName: \"kubernetes.io/projected/c97cf881-9c57-4f1a-a261-ae0ff786ad82-kube-api-access-plqsj\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.085446 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-metrics-client-ca\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.086283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085535 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-federate-client-tls\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.086283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-telemeter-client-tls\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.086283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-serving-certs-ca-bundle\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.086283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.086283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.086283 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.085959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-secret-telemeter-client\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.151013 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.150986 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6597794cf6-jxhvl"]
Apr 23 08:53:38.152596 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:38.152563 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod222c8c75_9350_46dc_9088_28d00d4e6b2a.slice/crio-1546a851fd550b013fc6c0a62d3f7caa0b7efc9ae1c3674e006a44811cd76083 WatchSource:0}: Error finding container 1546a851fd550b013fc6c0a62d3f7caa0b7efc9ae1c3674e006a44811cd76083: Status 404 returned error can't find the container with id 1546a851fd550b013fc6c0a62d3f7caa0b7efc9ae1c3674e006a44811cd76083
Apr 23 08:53:38.187104 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plqsj\" (UniqueName: \"kubernetes.io/projected/c97cf881-9c57-4f1a-a261-ae0ff786ad82-kube-api-access-plqsj\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-metrics-client-ca\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-federate-client-tls\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-telemeter-client-tls\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-serving-certs-ca-bundle\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187424 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187424 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.187488 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187461 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-secret-telemeter-client\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.188008 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.187961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-metrics-client-ca\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.188129 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.188082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-serving-certs-ca-bundle\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.188452 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.188408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97cf881-9c57-4f1a-a261-ae0ff786ad82-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.191168 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.190769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.191168 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.190921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-secret-telemeter-client\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.191168 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.191085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-telemeter-client-tls\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.191628 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.191608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c97cf881-9c57-4f1a-a261-ae0ff786ad82-federate-client-tls\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.195125 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.195106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqsj\" (UniqueName: \"kubernetes.io/projected/c97cf881-9c57-4f1a-a261-ae0ff786ad82-kube-api-access-plqsj\") pod \"telemeter-client-79b6cb47bb-xxqfk\" (UID: \"c97cf881-9c57-4f1a-a261-ae0ff786ad82\") " pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.359069 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.358984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"
Apr 23 08:53:38.713211 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:38.713181 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk"]
Apr 23 08:53:38.718204 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:38.718180 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc97cf881_9c57_4f1a_a261_ae0ff786ad82.slice/crio-f5f5c8969eb42f4b84aa54ab22a0c1ab0e2e003c5b3fa7e8320fdecc9c1cb4a9 WatchSource:0}: Error finding container f5f5c8969eb42f4b84aa54ab22a0c1ab0e2e003c5b3fa7e8320fdecc9c1cb4a9: Status 404 returned error can't find the container with id f5f5c8969eb42f4b84aa54ab22a0c1ab0e2e003c5b3fa7e8320fdecc9c1cb4a9
Apr 23 08:53:39.081506 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.081474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qfjzv" event={"ID":"fbb544b6-122a-4e2a-9835-e970e273e58b","Type":"ContainerStarted","Data":"0a1787800184913c4f11353f84c2266d87b3d35897ca42613e66b1f62838e036"}
Apr 23 08:53:39.082720 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.082695 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" event={"ID":"222c8c75-9350-46dc-9088-28d00d4e6b2a","Type":"ContainerStarted","Data":"1546a851fd550b013fc6c0a62d3f7caa0b7efc9ae1c3674e006a44811cd76083"}
Apr 23 08:53:39.085576 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.085545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"6b4f36016c9f73518e22845b7d95b18d00c75cac452a366e3c8a886081f6a275"}
Apr 23 08:53:39.085576 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.085578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"a58592cbea607021b0aa90332ddee10de2c5261eb8d9188a9ae93dfe72b08173"}
Apr 23 08:53:39.086015 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.085588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"4362ffea44bf09022470b26fe2af27e9e77ee52465b3c2e674f662a360369abd"}
Apr 23 08:53:39.086015 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.085596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"e140eb3fb83d586efaa772c49bceb09f2377cd3bcc65bdf82749d8862f47ce8f"}
Apr 23 08:53:39.086015 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.085605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"6878bdf6f3ffcb2a45a1ac6a950ee9888f47ef02fdb55f88290efd0f2c8a0ced"}
Apr 23 08:53:39.086672 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.086641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk" event={"ID":"c97cf881-9c57-4f1a-a261-ae0ff786ad82","Type":"ContainerStarted","Data":"f5f5c8969eb42f4b84aa54ab22a0c1ab0e2e003c5b3fa7e8320fdecc9c1cb4a9"}
Apr 23 08:53:39.097553 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:39.097503 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qfjzv" podStartSLOduration=130.299136166 podStartE2EDuration="2m12.09748875s" podCreationTimestamp="2026-04-23 08:51:27 +0000 UTC" firstStartedPulling="2026-04-23 08:53:36.273920312 +0000 UTC m=+161.249227918" lastFinishedPulling="2026-04-23 08:53:38.072272884 +0000 UTC m=+163.047580502" observedRunningTime="2026-04-23 08:53:39.09627354 +0000 UTC m=+164.071581170" watchObservedRunningTime="2026-04-23 08:53:39.09748875 +0000 UTC m=+164.072796377"
Apr 23 08:53:40.091671 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:40.091639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" event={"ID":"222c8c75-9350-46dc-9088-28d00d4e6b2a","Type":"ContainerStarted","Data":"5cfb366a3a396ce4209839e5724a7e0a6ad0aa1c010358b9e38af6255d8668ef"}
Apr 23 08:53:40.095027 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:40.094997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerStarted","Data":"7dc289cce5cc28145605bb43811296e5a5b6fce7b6e32bdcdf13e8dc653bc576"}
Apr 23 08:53:40.109696 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:40.109651 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl" podStartSLOduration=1.645463991 podStartE2EDuration="3.109636619s" podCreationTimestamp="2026-04-23 08:53:37 +0000 UTC" firstStartedPulling="2026-04-23 08:53:38.154590472 +0000 UTC m=+163.129898077" lastFinishedPulling="2026-04-23 08:53:39.618763096 +0000 UTC m=+164.594070705" observedRunningTime="2026-04-23 08:53:40.108400923 +0000 UTC m=+165.083708550" watchObservedRunningTime="2026-04-23 08:53:40.109636619 +0000 UTC m=+165.084944244"
Apr 23 08:53:40.137846 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:40.137792 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.122082063 podStartE2EDuration="7.137774929s" podCreationTimestamp="2026-04-23 08:53:33 +0000 UTC" firstStartedPulling="2026-04-23 08:53:35.004263531 +0000 UTC m=+159.979571140" lastFinishedPulling="2026-04-23 08:53:40.019956389 +0000 UTC m=+164.995264006" observedRunningTime="2026-04-23 08:53:40.134628971 +0000 UTC m=+165.109936598" watchObservedRunningTime="2026-04-23 08:53:40.137774929 +0000 UTC m=+165.113082551"
Apr 23 08:53:40.540380 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:40.540353 2575 scope.go:117] "RemoveContainer" containerID="fe3a9226043c3fb7f8157c993a186fbde7d5053ab3b2bedc7c15be382d20b22a"
Apr 23 08:53:40.540603 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:53:40.540573 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-dm56t_openshift-console-operator(b8ea5f2d-a09a-4865-8f65-103aa49ba68c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podUID="b8ea5f2d-a09a-4865-8f65-103aa49ba68c"
Apr 23 08:53:41.099310 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.099213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk" event={"ID":"c97cf881-9c57-4f1a-a261-ae0ff786ad82","Type":"ContainerStarted","Data":"0aa6fb0e6aad9e6881d2c0c52f58740c6e996d380442414573563cc04bd933ff"}
Apr 23 08:53:41.099310 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.099252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk" event={"ID":"c97cf881-9c57-4f1a-a261-ae0ff786ad82","Type":"ContainerStarted","Data":"1102a1aee12b37c92640311044068a74c2386b7674566d116d4fea058b7bae07"}
Apr 23 08:53:41.099310 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.099264 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk" event={"ID":"c97cf881-9c57-4f1a-a261-ae0ff786ad82","Type":"ContainerStarted","Data":"bbf326b949aa3b81941c4464a9ef4ec547947d054ad27c80f7080664064ca0fa"}
Apr 23 08:53:41.122483 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.122415 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-79b6cb47bb-xxqfk" podStartSLOduration=1.071265222 podStartE2EDuration="3.122387603s" podCreationTimestamp="2026-04-23 08:53:38 +0000 UTC" firstStartedPulling="2026-04-23 08:53:38.720411945 +0000 UTC m=+163.695719553" lastFinishedPulling="2026-04-23 08:53:40.771534317 +0000 UTC m=+165.746841934" observedRunningTime="2026-04-23 08:53:41.121923872 +0000 UTC m=+166.097231497" watchObservedRunningTime="2026-04-23 08:53:41.122387603 +0000 UTC m=+166.097695231"
Apr 23 08:53:41.539679 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.539644 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qm6xv"
Apr 23 08:53:41.542574 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.542544 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cjh2d\""
Apr 23 08:53:41.550861 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.550844 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qm6xv"
Apr 23 08:53:41.660716 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:41.660688 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qm6xv"]
Apr 23 08:53:41.663502 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:41.663477 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c57c70_2bd6_42fa_9ece_35b56e75a778.slice/crio-457c71784a75baf3de18b4f7d4a80d2c2eb89553b659570842810b9cff54e79b WatchSource:0}: Error finding container 457c71784a75baf3de18b4f7d4a80d2c2eb89553b659570842810b9cff54e79b: Status 404 returned error can't find the container with id 457c71784a75baf3de18b4f7d4a80d2c2eb89553b659570842810b9cff54e79b
Apr 23 08:53:42.104614 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:42.104580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qm6xv" event={"ID":"f3c57c70-2bd6-42fa-9ece-35b56e75a778","Type":"ContainerStarted","Data":"457c71784a75baf3de18b4f7d4a80d2c2eb89553b659570842810b9cff54e79b"}
Apr 23 08:53:44.112421 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:44.112333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qm6xv" event={"ID":"f3c57c70-2bd6-42fa-9ece-35b56e75a778","Type":"ContainerStarted","Data":"cc2efac774024994f4aea53aa0703e90b860586d444bf219c38dda4171fba436"}
Apr 23 08:53:44.112421 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:44.112368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qm6xv" event={"ID":"f3c57c70-2bd6-42fa-9ece-35b56e75a778","Type":"ContainerStarted","Data":"293123bccb80440fe856a6ba5ebb7fc5982f530b359b99647e9df4a30f8b022a"}
Apr 23 08:53:44.112820 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:44.112480 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qm6xv"
Apr 23 08:53:44.132589 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:44.132538 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qm6xv" podStartSLOduration=135.014030025 podStartE2EDuration="2m17.132522433s" podCreationTimestamp="2026-04-23 08:51:27 +0000 UTC" firstStartedPulling="2026-04-23 08:53:41.665255118 +0000 UTC m=+166.640562723" lastFinishedPulling="2026-04-23 08:53:43.783747518 +0000 UTC m=+168.759055131" observedRunningTime="2026-04-23 08:53:44.131491971 +0000 UTC m=+169.106799599" watchObservedRunningTime="2026-04-23 08:53:44.132522433 +0000 UTC m=+169.107830059"
Apr 23 08:53:45.542618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:45.542589 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:53:48.043214 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:48.043187 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7d564d9886-5ch26"
Apr 23 08:53:51.540002 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:51.539968 2575 scope.go:117] "RemoveContainer" containerID="fe3a9226043c3fb7f8157c993a186fbde7d5053ab3b2bedc7c15be382d20b22a"
Apr 23 08:53:52.136346 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.136318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 08:53:52.136502 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.136402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" event={"ID":"b8ea5f2d-a09a-4865-8f65-103aa49ba68c","Type":"ContainerStarted","Data":"057dbf3f0e281c480be8b1b1a52e50fc9743d8e9d6f9bbfd16efa54060294271"}
Apr 23 08:53:52.136695 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.136670 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:53:52.140643 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.140620 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ddgns"]
Apr 23 08:53:52.144025 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.144009 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t"
Apr 23 08:53:52.144105 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.144087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:53:52.146307 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.146286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-27sb6\""
Apr 23 08:53:52.146402 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.146286 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 08:53:52.146507 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.146428 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 08:53:52.151421 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.151396 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ddgns"]
Apr 23 08:53:52.153274 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.153238 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-dm56t" podStartSLOduration=51.251218526 podStartE2EDuration="54.153226436s" podCreationTimestamp="2026-04-23 08:52:58 +0000 UTC" firstStartedPulling="2026-04-23 08:52:59.216073035 +0000 UTC m=+124.191380640" lastFinishedPulling="2026-04-23 08:53:02.118080945 +0000 UTC m=+127.093388550" observedRunningTime="2026-04-23 08:53:52.152836603 +0000 UTC m=+177.128144229" watchObservedRunningTime="2026-04-23 08:53:52.153226436 +0000 UTC m=+177.128534066"
Apr 23 08:53:52.212197 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.212169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlc75\" (UniqueName: \"kubernetes.io/projected/1762b05e-88c6-410f-99cf-cbd73bd4ca6e-kube-api-access-jlc75\") pod \"downloads-6bcc868b7-ddgns\" (UID: \"1762b05e-88c6-410f-99cf-cbd73bd4ca6e\") " pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:53:52.312963 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.312934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlc75\" (UniqueName: \"kubernetes.io/projected/1762b05e-88c6-410f-99cf-cbd73bd4ca6e-kube-api-access-jlc75\") pod \"downloads-6bcc868b7-ddgns\" (UID: \"1762b05e-88c6-410f-99cf-cbd73bd4ca6e\") " pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:53:52.324921 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.324880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlc75\" (UniqueName: \"kubernetes.io/projected/1762b05e-88c6-410f-99cf-cbd73bd4ca6e-kube-api-access-jlc75\") pod \"downloads-6bcc868b7-ddgns\" (UID: \"1762b05e-88c6-410f-99cf-cbd73bd4ca6e\") " pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:53:52.453249 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.453168 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:53:52.565732 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:52.565703 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ddgns"]
Apr 23 08:53:52.569150 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:53:52.569111 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1762b05e_88c6_410f_99cf_cbd73bd4ca6e.slice/crio-0b249ec61e7968ca0b0399ed20102dd597ea9a0d25d3f1c00d6cdc71a0d09081 WatchSource:0}: Error finding container 0b249ec61e7968ca0b0399ed20102dd597ea9a0d25d3f1c00d6cdc71a0d09081: Status 404 returned error can't find the container with id 0b249ec61e7968ca0b0399ed20102dd597ea9a0d25d3f1c00d6cdc71a0d09081
Apr 23 08:53:53.140454 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:53.140413 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ddgns" event={"ID":"1762b05e-88c6-410f-99cf-cbd73bd4ca6e","Type":"ContainerStarted","Data":"0b249ec61e7968ca0b0399ed20102dd597ea9a0d25d3f1c00d6cdc71a0d09081"}
Apr 23 08:53:54.118395 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:54.118360 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qm6xv"
Apr 23 08:53:57.580437 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:57.580403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl"
Apr 23 08:53:57.580892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:53:57.580484 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl"
Apr 23 08:54:08.188400 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.188363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ddgns" event={"ID":"1762b05e-88c6-410f-99cf-cbd73bd4ca6e","Type":"ContainerStarted","Data":"d9251d68cfd5a8c5a11e2bbd4b037d64a169b302186d2893528617af11e67994"}
Apr 23 08:54:08.188846 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.188573 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:54:08.190156 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.190125 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-ddgns container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.19:8080/\": dial tcp 10.134.0.19:8080: connect: connection refused" start-of-body=
Apr 23 08:54:08.190272 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.190181 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-ddgns" podUID="1762b05e-88c6-410f-99cf-cbd73bd4ca6e" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.19:8080/\": dial tcp 10.134.0.19:8080: connect: connection refused"
Apr 23 08:54:08.206485 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.206441 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ddgns" podStartSLOduration=0.721787997 podStartE2EDuration="16.206428889s" podCreationTimestamp="2026-04-23 08:53:52 +0000 UTC" firstStartedPulling="2026-04-23 08:53:52.571360911 +0000 UTC m=+177.546668520" lastFinishedPulling="2026-04-23 08:54:08.056001804 +0000 UTC m=+193.031309412" observedRunningTime="2026-04-23 08:54:08.205831777 +0000 UTC m=+193.181139415" watchObservedRunningTime="2026-04-23 08:54:08.206428889 +0000 UTC m=+193.181736515"
Apr 23 08:54:08.246372 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.246341 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-668fcd8bd7-fmljf"]
Apr 23 08:54:08.250909 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.250873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.253627 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.253604 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gz86g\""
Apr 23 08:54:08.253792 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.253606 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 08:54:08.254024 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.254003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 08:54:08.254247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.254077 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 08:54:08.254247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.254102 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 08:54:08.254247 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.254113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 08:54:08.259368 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-oauth-config\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259463 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-config\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259517 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-trusted-ca-bundle\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259517 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkkjv\" (UniqueName: \"kubernetes.io/projected/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-kube-api-access-rkkjv\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259613 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-serving-cert\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259613 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-oauth-serving-cert\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259613 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-service-ca\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.259750 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.259650 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 08:54:08.260166 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.260060 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-668fcd8bd7-fmljf"]
Apr 23 08:54:08.360268 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-oauth-config\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.360445 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-config\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.360445 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-trusted-ca-bundle\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.360445 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkkjv\" (UniqueName: \"kubernetes.io/projected/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-kube-api-access-rkkjv\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.360445 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-serving-cert\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.360445 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-oauth-serving-cert\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.360679 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.360481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-service-ca\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.361287 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.361258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-trusted-ca-bundle\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.363017 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.362973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-oauth-config\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.363133 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.363058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-config\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.363189 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.363141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-oauth-serving-cert\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.363249 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.363188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-service-ca\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.363297 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.363261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-serving-cert\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.373628 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.373604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkkjv\" (UniqueName: \"kubernetes.io/projected/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-kube-api-access-rkkjv\") pod \"console-668fcd8bd7-fmljf\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") " pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.562885 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.562839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:08.695253 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:08.695221 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-668fcd8bd7-fmljf"]
Apr 23 08:54:08.698041 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:54:08.697972 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00c4b9dc_a394_43a8_8d4f_2b4580a7c17c.slice/crio-0e7cc0825206a48b07186ee196ffd92d2ac44b0bc9d963eb1936eef0606579cd WatchSource:0}: Error finding container 0e7cc0825206a48b07186ee196ffd92d2ac44b0bc9d963eb1936eef0606579cd: Status 404 returned error can't find the container with id 0e7cc0825206a48b07186ee196ffd92d2ac44b0bc9d963eb1936eef0606579cd
Apr 23 08:54:09.194073 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:09.194021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668fcd8bd7-fmljf" event={"ID":"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c","Type":"ContainerStarted","Data":"0e7cc0825206a48b07186ee196ffd92d2ac44b0bc9d963eb1936eef0606579cd"}
Apr 23 08:54:09.205183 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:09.205135 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ddgns"
Apr 23 08:54:13.208428 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:13.208386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668fcd8bd7-fmljf" event={"ID":"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c","Type":"ContainerStarted","Data":"eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a"}
Apr 23 08:54:13.226434 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:13.226386 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-668fcd8bd7-fmljf" podStartSLOduration=1.691355857 podStartE2EDuration="5.226371972s" podCreationTimestamp="2026-04-23 08:54:08 +0000 UTC" firstStartedPulling="2026-04-23 08:54:08.700318012 +0000 UTC m=+193.675625617" lastFinishedPulling="2026-04-23 08:54:12.235334111 +0000 UTC m=+197.210641732" observedRunningTime="2026-04-23 08:54:13.225028303 +0000 UTC m=+198.200335933" watchObservedRunningTime="2026-04-23 08:54:13.226371972 +0000 UTC m=+198.201679599"
Apr 23 08:54:17.587488 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:17.587458 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl"
Apr 23 08:54:17.592251 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:17.592224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6597794cf6-jxhvl"
Apr 23 08:54:18.563352 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:18.563316 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:18.563352 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:18.563360 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:18.567655 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:18.567635 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:19.231311 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:19.231284 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:54:28.252758 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:28.252726 2575 generic.go:358] "Generic (PLEG): container finished" podID="ddc13db8-46f8-47be-b720-51cd59fd933a" containerID="64f4f97656a42a263546baa1a5c7b7f1357bc7bff9382b8abe40d9b14f0cc77e" exitCode=0
Apr 23 08:54:28.253176 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:28.252810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nsgdp" event={"ID":"ddc13db8-46f8-47be-b720-51cd59fd933a","Type":"ContainerDied","Data":"64f4f97656a42a263546baa1a5c7b7f1357bc7bff9382b8abe40d9b14f0cc77e"}
Apr 23 08:54:28.253216 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:28.253185 2575 scope.go:117] "RemoveContainer" containerID="64f4f97656a42a263546baa1a5c7b7f1357bc7bff9382b8abe40d9b14f0cc77e"
Apr 23 08:54:29.257510 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:29.257474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nsgdp" event={"ID":"ddc13db8-46f8-47be-b720-51cd59fd933a","Type":"ContainerStarted","Data":"33fc5f3591e85966c0da1efe1201c3942e11fd80c4d71f352ea12e31d2acdc72"}
Apr 23 08:54:53.179507 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.179471 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:54:53.180005 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.179916 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="alertmanager" containerID="cri-o://6878bdf6f3ffcb2a45a1ac6a950ee9888f47ef02fdb55f88290efd0f2c8a0ced" gracePeriod=120
Apr 23 08:54:53.180005 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.179978 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy" containerID="cri-o://a58592cbea607021b0aa90332ddee10de2c5261eb8d9188a9ae93dfe72b08173" gracePeriod=120
Apr 23 08:54:53.180130 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.179982 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-metric" containerID="cri-o://6b4f36016c9f73518e22845b7d95b18d00c75cac452a366e3c8a886081f6a275" gracePeriod=120
Apr 23 08:54:53.180130 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.180012 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="config-reloader" containerID="cri-o://e140eb3fb83d586efaa772c49bceb09f2377cd3bcc65bdf82749d8862f47ce8f" gracePeriod=120
Apr 23 08:54:53.180130 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.180047 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="prom-label-proxy" containerID="cri-o://7dc289cce5cc28145605bb43811296e5a5b6fce7b6e32bdcdf13e8dc653bc576" gracePeriod=120
Apr 23 08:54:53.180130 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.179978 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-web" containerID="cri-o://4362ffea44bf09022470b26fe2af27e9e77ee52465b3c2e674f662a360369abd" gracePeriod=120
Apr 23 08:54:53.329361 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329331 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="7dc289cce5cc28145605bb43811296e5a5b6fce7b6e32bdcdf13e8dc653bc576" exitCode=0
Apr 23 08:54:53.329361 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329353 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="a58592cbea607021b0aa90332ddee10de2c5261eb8d9188a9ae93dfe72b08173" exitCode=0
Apr 23 08:54:53.329361 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329359 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="e140eb3fb83d586efaa772c49bceb09f2377cd3bcc65bdf82749d8862f47ce8f" exitCode=0
Apr 23 08:54:53.329361 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329365 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="6878bdf6f3ffcb2a45a1ac6a950ee9888f47ef02fdb55f88290efd0f2c8a0ced" exitCode=0
Apr 23 08:54:53.329542 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"7dc289cce5cc28145605bb43811296e5a5b6fce7b6e32bdcdf13e8dc653bc576"}
Apr 23 08:54:53.329542 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"a58592cbea607021b0aa90332ddee10de2c5261eb8d9188a9ae93dfe72b08173"}
Apr 23 08:54:53.329542 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"e140eb3fb83d586efaa772c49bceb09f2377cd3bcc65bdf82749d8862f47ce8f"}
Apr 23 08:54:53.329542 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:53.329423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"6878bdf6f3ffcb2a45a1ac6a950ee9888f47ef02fdb55f88290efd0f2c8a0ced"}
Apr 23 08:54:54.336565 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.336538 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="6b4f36016c9f73518e22845b7d95b18d00c75cac452a366e3c8a886081f6a275" exitCode=0
Apr 23 08:54:54.336565 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.336561 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerID="4362ffea44bf09022470b26fe2af27e9e77ee52465b3c2e674f662a360369abd" exitCode=0
Apr 23 08:54:54.336961 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.336616 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"6b4f36016c9f73518e22845b7d95b18d00c75cac452a366e3c8a886081f6a275"}
Apr 23 08:54:54.336961 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.336661 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"4362ffea44bf09022470b26fe2af27e9e77ee52465b3c2e674f662a360369abd"}
Apr 23 08:54:54.442511 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.442485 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:54:54.464728 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464701 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-cluster-tls-config\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.464832 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464736 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-metrics-client-ca\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.464832 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464759 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-out\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.464832 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464790 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464842 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-volume\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464875 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-tls-assets\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464925 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464956 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-main-db\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.464996 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465028 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-web-config\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465078 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465105 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-web\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465130 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7frv\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-kube-api-access-p7frv\") pod \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\" (UID: \"1a4d8dea-d968-447a-ac8c-b695ed740c1a\") "
Apr 23 08:54:54.465259 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465147 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:54.465759 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465431 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:54:54.465759 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465578 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:54:54.465853 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465800 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-metrics-client-ca\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.465853 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465828 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-main-db\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.465853 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.465850 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4d8dea-d968-447a-ac8c-b695ed740c1a-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.468101 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.468053 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.468101 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.468089 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.468278 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.468133 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:54:54.468581 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.468540 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.468848 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.468819 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.468968 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.468949 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.470191 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.470166 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-out" (OuterVolumeSpecName: "config-out") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:54:54.470336 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.470293 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-kube-api-access-p7frv" (OuterVolumeSpecName: "kube-api-access-p7frv") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "kube-api-access-p7frv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:54:54.474837 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.474813 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.482092 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.482064 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-web-config" (OuterVolumeSpecName: "web-config") pod "1a4d8dea-d968-447a-ac8c-b695ed740c1a" (UID: "1a4d8dea-d968-447a-ac8c-b695ed740c1a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:54:54.566892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566835 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.566892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566857 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7frv\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-kube-api-access-p7frv\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.566892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566867 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-cluster-tls-config\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.566892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566876 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-out\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.566892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566885 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-main-tls\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.566892 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566893 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-config-volume\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.567157 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566924 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a4d8dea-d968-447a-ac8c-b695ed740c1a-tls-assets\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.567157 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566937 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.567157 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566953 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:54.567157 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:54.566964 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a4d8dea-d968-447a-ac8c-b695ed740c1a-web-config\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:54:55.341651 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.341614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"1a4d8dea-d968-447a-ac8c-b695ed740c1a","Type":"ContainerDied","Data":"fccc3093410bebe059a0d3f2b42a84515585767fdc9691cac496228a197f4358"} Apr 23 08:54:55.342069 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.341661 2575 scope.go:117] "RemoveContainer" containerID="7dc289cce5cc28145605bb43811296e5a5b6fce7b6e32bdcdf13e8dc653bc576" Apr 23 08:54:55.342069 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.341675 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.348875 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.348855 2575 scope.go:117] "RemoveContainer" containerID="6b4f36016c9f73518e22845b7d95b18d00c75cac452a366e3c8a886081f6a275" Apr 23 08:54:55.355726 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.355708 2575 scope.go:117] "RemoveContainer" containerID="a58592cbea607021b0aa90332ddee10de2c5261eb8d9188a9ae93dfe72b08173" Apr 23 08:54:55.364527 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.364508 2575 scope.go:117] "RemoveContainer" containerID="4362ffea44bf09022470b26fe2af27e9e77ee52465b3c2e674f662a360369abd" Apr 23 08:54:55.365748 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.365726 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:54:55.369641 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.369623 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:54:55.371803 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.371785 2575 scope.go:117] "RemoveContainer" containerID="e140eb3fb83d586efaa772c49bceb09f2377cd3bcc65bdf82749d8862f47ce8f" Apr 23 08:54:55.377974 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.377958 2575 scope.go:117] "RemoveContainer" containerID="6878bdf6f3ffcb2a45a1ac6a950ee9888f47ef02fdb55f88290efd0f2c8a0ced" Apr 23 08:54:55.383936 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.383920 
2575 scope.go:117] "RemoveContainer" containerID="3e5423673e4c0a8ee97ce1f3772a76bc5203f01c4daea47584672a679a49ae0e" Apr 23 08:54:55.391976 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.391957 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:54:55.392277 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392262 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy" Apr 23 08:54:55.392348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392280 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy" Apr 23 08:54:55.392348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392293 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="prom-label-proxy" Apr 23 08:54:55.392348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392303 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="prom-label-proxy" Apr 23 08:54:55.392348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392327 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-web" Apr 23 08:54:55.392348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392336 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-web" Apr 23 08:54:55.392348 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392348 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="init-config-reloader" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392356 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="init-config-reloader" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392366 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="config-reloader" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392374 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="config-reloader" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392384 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="alertmanager" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392393 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="alertmanager" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392409 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-metric" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392417 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-metric" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392487 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="alertmanager" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392501 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-metric" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:54:55.392514 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="config-reloader" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392523 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy-web" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392534 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="prom-label-proxy" Apr 23 08:54:55.392618 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.392544 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" containerName="kube-rbac-proxy" Apr 23 08:54:55.398206 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.398188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.400663 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.400646 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 08:54:55.400782 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.400764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 08:54:55.400832 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.400772 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 08:54:55.400887 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.400771 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 08:54:55.400887 ip-10-0-141-250 kubenswrapper[2575]: I0423 
08:54:55.400867 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 08:54:55.401012 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.400886 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 08:54:55.401012 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.400952 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 08:54:55.401326 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.401308 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dswlz\"" Apr 23 08:54:55.401456 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.401441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 08:54:55.405755 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.405738 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 08:54:55.407811 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.407792 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 08:54:55.474447 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx575\" (UniqueName: \"kubernetes.io/projected/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-kube-api-access-dx575\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474453 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-config-volume\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-metrics-client-ca\") pod \"alertmanager-main-0\" 
(UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474733 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474733 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474733 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-config-out\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474733 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-web-config\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474845 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474845 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.474845 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.474821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.545182 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.545113 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4d8dea-d968-447a-ac8c-b695ed740c1a" path="/var/lib/kubelet/pods/1a4d8dea-d968-447a-ac8c-b695ed740c1a/volumes" Apr 23 08:54:55.575973 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.575948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576062 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.575985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576062 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576062 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx575\" (UniqueName: \"kubernetes.io/projected/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-kube-api-access-dx575\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-config-volume\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576171 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576234 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576429 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576429 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576429 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576429 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-config-out\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.576429 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.576422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-web-config\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.577891 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.577863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579333 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.578890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579333 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.578989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579333 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.579049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579333 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.579140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579333 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.579275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.579355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579577 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.579506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-config-volume\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.579812 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.579765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.580101 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.580081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.580354 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.580334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-config-out\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.580907 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.580881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-web-config\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 08:54:55.584269 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.584247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx575\" 
(UniqueName: \"kubernetes.io/projected/64d31fe5-d470-40ef-ac7c-e06d9804bc3b-kube-api-access-dx575\") pod \"alertmanager-main-0\" (UID: \"64d31fe5-d470-40ef-ac7c-e06d9804bc3b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:54:55.708220 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.708186 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 08:54:55.829379 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:55.829306 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 08:54:55.832757 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:54:55.832713 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d31fe5_d470_40ef_ac7c_e06d9804bc3b.slice/crio-09a1783d11064927e0db2d9e40e54bf3582448e3b26e55bf7021639559a6d267 WatchSource:0}: Error finding container 09a1783d11064927e0db2d9e40e54bf3582448e3b26e55bf7021639559a6d267: Status 404 returned error can't find the container with id 09a1783d11064927e0db2d9e40e54bf3582448e3b26e55bf7021639559a6d267
Apr 23 08:54:56.345799 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:56.345766 2575 generic.go:358] "Generic (PLEG): container finished" podID="64d31fe5-d470-40ef-ac7c-e06d9804bc3b" containerID="a2ae40cf8e45cc9bce4fe9612730d5a5792aa0ba56e70f98c66997fcc436b604" exitCode=0
Apr 23 08:54:56.345799 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:56.345800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerDied","Data":"a2ae40cf8e45cc9bce4fe9612730d5a5792aa0ba56e70f98c66997fcc436b604"}
Apr 23 08:54:56.346318 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:56.345821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"09a1783d11064927e0db2d9e40e54bf3582448e3b26e55bf7021639559a6d267"}
Apr 23 08:54:57.351157 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.351120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"78a50186cad2e22f48c40d63791d0deac9b5443faab5ab6472a4f095b011673c"}
Apr 23 08:54:57.351543 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.351164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"f7fcab1d710b972653b77c1af88ddfe5c1310d901af078ae3d6caf76b5e4c4d3"}
Apr 23 08:54:57.351543 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.351179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"35734a94f08c81fe68fa74e357cb6e0e734ae2e11d08de35d71f6be72fb37430"}
Apr 23 08:54:57.351543 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.351194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"75285886a3231d13b2336fef35947b537c40aa160be7382943d3614e6d6ae6e2"}
Apr 23 08:54:57.351543 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.351206 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"047cf271b980b6fde1abf16e76553676b69e5f9bd7601535ccbce7c387d5f1fe"}
Apr 23 08:54:57.351543 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.351218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"64d31fe5-d470-40ef-ac7c-e06d9804bc3b","Type":"ContainerStarted","Data":"2411a96e5ae7540eed3667b4b1c0e895b39578e0e0cd1681edc712ffeed0e18c"}
Apr 23 08:54:57.377877 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:54:57.377833 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.377817391 podStartE2EDuration="2.377817391s" podCreationTimestamp="2026-04-23 08:54:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 08:54:57.376099339 +0000 UTC m=+242.351406976" watchObservedRunningTime="2026-04-23 08:54:57.377817391 +0000 UTC m=+242.353125017"
Apr 23 08:55:01.085082 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:01.085048 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-668fcd8bd7-fmljf"]
Apr 23 08:55:07.369820 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:07.369774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:55:07.372079 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:07.372059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c32e908b-8a1f-4d28-99e1-dce39209186a-metrics-certs\") pod \"network-metrics-daemon-9tmnv\" (UID: \"c32e908b-8a1f-4d28-99e1-dce39209186a\") " pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:55:07.446659 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:07.446627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wtr6s\""
Apr 23 08:55:07.453991 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:07.453965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9tmnv"
Apr 23 08:55:07.568558 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:07.568532 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9tmnv"]
Apr 23 08:55:07.571398 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:55:07.571367 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32e908b_8a1f_4d28_99e1_dce39209186a.slice/crio-3d492b48a73d8728499480ccf1ac660094159c698f8f4746deb549d311dc5f45 WatchSource:0}: Error finding container 3d492b48a73d8728499480ccf1ac660094159c698f8f4746deb549d311dc5f45: Status 404 returned error can't find the container with id 3d492b48a73d8728499480ccf1ac660094159c698f8f4746deb549d311dc5f45
Apr 23 08:55:08.385802 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:08.385764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9tmnv" event={"ID":"c32e908b-8a1f-4d28-99e1-dce39209186a","Type":"ContainerStarted","Data":"3d492b48a73d8728499480ccf1ac660094159c698f8f4746deb549d311dc5f45"}
Apr 23 08:55:09.390131 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:09.390094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9tmnv" event={"ID":"c32e908b-8a1f-4d28-99e1-dce39209186a","Type":"ContainerStarted","Data":"2422a8b11e6ed4b875d4252f9fee06c4c80fad6a75babd37324fcc41cc9ebb2b"}
Apr 23 08:55:09.390131 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:09.390133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9tmnv" event={"ID":"c32e908b-8a1f-4d28-99e1-dce39209186a","Type":"ContainerStarted","Data":"36acfee5b5109f665b1f6c980e2b635f7d65dc911226131b123327631c2a0b83"}
Apr 23 08:55:09.406561 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:09.406519 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9tmnv" podStartSLOduration=253.421669791 podStartE2EDuration="4m14.406505323s" podCreationTimestamp="2026-04-23 08:50:55 +0000 UTC" firstStartedPulling="2026-04-23 08:55:07.573110828 +0000 UTC m=+252.548418433" lastFinishedPulling="2026-04-23 08:55:08.557946349 +0000 UTC m=+253.533253965" observedRunningTime="2026-04-23 08:55:09.405943829 +0000 UTC m=+254.381251520" watchObservedRunningTime="2026-04-23 08:55:09.406505323 +0000 UTC m=+254.381813012"
Apr 23 08:55:18.278668 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.278585 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qb26v"]
Apr 23 08:55:18.281870 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.281852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.284342 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.284325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 08:55:18.288921 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.288886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qb26v"]
Apr 23 08:55:18.361139 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.361106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f6bb4535-9d07-4788-b02b-2c58c53b4191-dbus\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.361271 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.361144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f6bb4535-9d07-4788-b02b-2c58c53b4191-original-pull-secret\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.361271 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.361163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f6bb4535-9d07-4788-b02b-2c58c53b4191-kubelet-config\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.462296 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.462263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f6bb4535-9d07-4788-b02b-2c58c53b4191-dbus\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.462296 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.462300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f6bb4535-9d07-4788-b02b-2c58c53b4191-original-pull-secret\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.462482 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.462319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f6bb4535-9d07-4788-b02b-2c58c53b4191-kubelet-config\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.462482 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.462442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f6bb4535-9d07-4788-b02b-2c58c53b4191-kubelet-config\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.462482 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.462459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f6bb4535-9d07-4788-b02b-2c58c53b4191-dbus\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.464571 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.464550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f6bb4535-9d07-4788-b02b-2c58c53b4191-original-pull-secret\") pod \"global-pull-secret-syncer-qb26v\" (UID: \"f6bb4535-9d07-4788-b02b-2c58c53b4191\") " pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.591680 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.591604 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qb26v"
Apr 23 08:55:18.709647 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:18.709599 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qb26v"]
Apr 23 08:55:18.713382 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:55:18.713343 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6bb4535_9d07_4788_b02b_2c58c53b4191.slice/crio-bfd0938bb0de170d6d6b7c1b7550eb502be1ea5d74337602bc72e1d047ef7b5e WatchSource:0}: Error finding container bfd0938bb0de170d6d6b7c1b7550eb502be1ea5d74337602bc72e1d047ef7b5e: Status 404 returned error can't find the container with id bfd0938bb0de170d6d6b7c1b7550eb502be1ea5d74337602bc72e1d047ef7b5e
Apr 23 08:55:19.419819 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:19.419775 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qb26v" event={"ID":"f6bb4535-9d07-4788-b02b-2c58c53b4191","Type":"ContainerStarted","Data":"bfd0938bb0de170d6d6b7c1b7550eb502be1ea5d74337602bc72e1d047ef7b5e"}
Apr 23 08:55:23.432768 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:23.432732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qb26v" event={"ID":"f6bb4535-9d07-4788-b02b-2c58c53b4191","Type":"ContainerStarted","Data":"fe60782b08003a230150e36af3d13c8bfb647805847114d4b3e632158a897efb"}
Apr 23 08:55:23.449575 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:23.449528 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qb26v" podStartSLOduration=1.759968711 podStartE2EDuration="5.449515582s" podCreationTimestamp="2026-04-23 08:55:18 +0000 UTC" firstStartedPulling="2026-04-23 08:55:18.714934462 +0000 UTC m=+263.690242071" lastFinishedPulling="2026-04-23 08:55:22.404481337 +0000 UTC m=+267.379788942" observedRunningTime="2026-04-23 08:55:23.448628477 +0000 UTC m=+268.423936106" watchObservedRunningTime="2026-04-23 08:55:23.449515582 +0000 UTC m=+268.424823205"
Apr 23 08:55:26.103797 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.103735 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-668fcd8bd7-fmljf" podUID="00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" containerName="console" containerID="cri-o://eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a" gracePeriod=15
Apr 23 08:55:26.347873 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.347852 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668fcd8bd7-fmljf_00c4b9dc-a394-43a8-8d4f-2b4580a7c17c/console/0.log"
Apr 23 08:55:26.348002 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.347933 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:55:26.431658 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431594 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-serving-cert\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.431658 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431649 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-service-ca\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.431856 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431695 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-trusted-ca-bundle\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.431856 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431721 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-oauth-config\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.431856 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431754 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-oauth-serving-cert\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.431856 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431774 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-config\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.431856 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.431805 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkkjv\" (UniqueName: \"kubernetes.io/projected/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-kube-api-access-rkkjv\") pod \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\" (UID: \"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c\") "
Apr 23 08:55:26.432191 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.432159 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:55:26.432258 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.432181 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-service-ca" (OuterVolumeSpecName: "service-ca") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:55:26.432258 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.432198 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-config" (OuterVolumeSpecName: "console-config") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:55:26.432332 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.432245 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 08:55:26.433943 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.433915 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:55:26.434053 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.433949 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-kube-api-access-rkkjv" (OuterVolumeSpecName: "kube-api-access-rkkjv") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "kube-api-access-rkkjv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:55:26.434053 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.434026 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" (UID: "00c4b9dc-a394-43a8-8d4f-2b4580a7c17c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 08:55:26.442370 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.442353 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-668fcd8bd7-fmljf_00c4b9dc-a394-43a8-8d4f-2b4580a7c17c/console/0.log"
Apr 23 08:55:26.442477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.442384 2575 generic.go:358] "Generic (PLEG): container finished" podID="00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" containerID="eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a" exitCode=2
Apr 23 08:55:26.442477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.442444 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668fcd8bd7-fmljf" event={"ID":"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c","Type":"ContainerDied","Data":"eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a"}
Apr 23 08:55:26.442477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.442469 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-668fcd8bd7-fmljf" event={"ID":"00c4b9dc-a394-43a8-8d4f-2b4580a7c17c","Type":"ContainerDied","Data":"0e7cc0825206a48b07186ee196ffd92d2ac44b0bc9d963eb1936eef0606579cd"}
Apr 23 08:55:26.442639 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.442482 2575 scope.go:117] "RemoveContainer" containerID="eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a"
Apr 23 08:55:26.442639 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.442445 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-668fcd8bd7-fmljf"
Apr 23 08:55:26.453010 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.452993 2575 scope.go:117] "RemoveContainer" containerID="eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a"
Apr 23 08:55:26.453255 ip-10-0-141-250 kubenswrapper[2575]: E0423 08:55:26.453236 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a\": container with ID starting with eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a not found: ID does not exist" containerID="eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a"
Apr 23 08:55:26.453300 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.453262 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a"} err="failed to get container status \"eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a\": rpc error: code = NotFound desc = could not find container \"eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a\": container with ID starting with eff4715f3bc5062bebd066bb31730c599ec0ee3d516db72ae9bba549490ecf8a not found: ID does not exist"
Apr 23 08:55:26.464470 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.464450 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-668fcd8bd7-fmljf"]
Apr 23 08:55:26.467879 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.467852 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-668fcd8bd7-fmljf"]
Apr 23 08:55:26.532493 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532462 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-trusted-ca-bundle\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:26.532493 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532490 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-oauth-config\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:26.532651 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532506 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-oauth-serving-cert\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:26.532651 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532520 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-config\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:26.532651 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532532 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkkjv\" (UniqueName: \"kubernetes.io/projected/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-kube-api-access-rkkjv\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:26.532651 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532544 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-console-serving-cert\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:26.532651 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:26.532558 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c-service-ca\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:55:27.544183 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:27.544153 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" path="/var/lib/kubelet/pods/00c4b9dc-a394-43a8-8d4f-2b4580a7c17c/volumes"
Apr 23 08:55:42.989354 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.989321 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"]
Apr 23 08:55:42.989730 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.989631 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" containerName="console"
Apr 23 08:55:42.989730 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.989643 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" containerName="console"
Apr 23 08:55:42.989730 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.989705 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="00c4b9dc-a394-43a8-8d4f-2b4580a7c17c" containerName="console"
Apr 23 08:55:42.991477 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.991462 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:42.994006 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.993979 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 08:55:42.994006 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.993999 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 08:55:42.994155 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:42.994094 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-xlwpr\""
Apr 23 08:55:43.000803 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.000784 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"]
Apr 23 08:55:43.171867 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.171832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.172063 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.171874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54kt\" (UniqueName: \"kubernetes.io/projected/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-kube-api-access-t54kt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.172063 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.171937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.272657 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.272566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.272657 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.272620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t54kt\" (UniqueName: \"kubernetes.io/projected/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-kube-api-access-t54kt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.272848 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.272660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.273001 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.272977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.273076 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.273047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.281664 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.281641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54kt\" (UniqueName: \"kubernetes.io/projected/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-kube-api-access-t54kt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.300005 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.299981 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:55:43.619570 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:43.619548 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"]
Apr 23 08:55:43.621942 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:55:43.621914 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bbbcbe_e510_4785_aac4_df3e7c59bfb9.slice/crio-9f8a6f8f6e651bfc2a629ed2793f6fb26dc985471858dc9673ff64a6f2a9684b WatchSource:0}: Error finding container 9f8a6f8f6e651bfc2a629ed2793f6fb26dc985471858dc9673ff64a6f2a9684b: Status 404 returned error can't find the container with id 9f8a6f8f6e651bfc2a629ed2793f6fb26dc985471858dc9673ff64a6f2a9684b
Apr 23 08:55:44.493773 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:44.493733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt" event={"ID":"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9","Type":"ContainerStarted","Data":"9f8a6f8f6e651bfc2a629ed2793f6fb26dc985471858dc9673ff64a6f2a9684b"}
Apr 23 08:55:49.510853 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:49.510819 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerID="100168c469423985ca6a8016afe7f313bea65ee3eb461eeae3dce3ad725e106e" exitCode=0
Apr 23 08:55:49.511354 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:49.510920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt" event={"ID":"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9","Type":"ContainerDied","Data":"100168c469423985ca6a8016afe7f313bea65ee3eb461eeae3dce3ad725e106e"}
Apr 23 08:55:51.518999 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:51.518965 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerID="d2d4fbce1b0cb6b71f7ddb7bb54ef4e361e19d16d746466a3ae8f33b8d98411d" exitCode=0
Apr 23 08:55:51.519342 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:51.519023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt" event={"ID":"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9","Type":"ContainerDied","Data":"d2d4fbce1b0cb6b71f7ddb7bb54ef4e361e19d16d746466a3ae8f33b8d98411d"}
Apr 23 08:55:55.433622 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:55.433532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 08:55:55.434642 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:55.434572 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 08:55:55.445270 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:55.445240 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 08:55:59.545991 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:59.545963 2575 generic.go:358] "Generic (PLEG): container finished" podID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerID="b021a46b83723b64aeba670ccf8b28e5b0d1c7f86864a02425e960167836d5e1" exitCode=0
Apr 23 08:55:59.546301 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:55:59.546008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt" event={"ID":"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9","Type":"ContainerDied","Data":"b021a46b83723b64aeba670ccf8b28e5b0d1c7f86864a02425e960167836d5e1"}
Apr 23 08:56:00.672462 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.672438 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt"
Apr 23 08:56:00.712734 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.712704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-bundle\") pod \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") "
Apr 23 08:56:00.712934 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.712752 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-util\") pod \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") "
Apr 23 08:56:00.712934 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.712818 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t54kt\" (UniqueName: \"kubernetes.io/projected/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-kube-api-access-t54kt\") pod \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\" (UID: \"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9\") "
Apr 23 08:56:00.713614 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.713578 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-bundle" (OuterVolumeSpecName: "bundle") pod "f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" (UID: "f8bbbcbe-e510-4785-aac4-df3e7c59bfb9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:56:00.714965 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.714931 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-kube-api-access-t54kt" (OuterVolumeSpecName: "kube-api-access-t54kt") pod "f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" (UID: "f8bbbcbe-e510-4785-aac4-df3e7c59bfb9"). InnerVolumeSpecName "kube-api-access-t54kt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 08:56:00.717263 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.717241 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-util" (OuterVolumeSpecName: "util") pod "f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" (UID: "f8bbbcbe-e510-4785-aac4-df3e7c59bfb9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 08:56:00.813915 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.813804 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t54kt\" (UniqueName: \"kubernetes.io/projected/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-kube-api-access-t54kt\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\"" Apr 23 08:56:00.813915 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.813837 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-bundle\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\"" Apr 23 08:56:00.813915 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:00.813846 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8bbbcbe-e510-4785-aac4-df3e7c59bfb9-util\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\"" Apr 23 08:56:01.553647 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:01.553618 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt" event={"ID":"f8bbbcbe-e510-4785-aac4-df3e7c59bfb9","Type":"ContainerDied","Data":"9f8a6f8f6e651bfc2a629ed2793f6fb26dc985471858dc9673ff64a6f2a9684b"} Apr 23 08:56:01.553647 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:01.553649 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f8a6f8f6e651bfc2a629ed2793f6fb26dc985471858dc9673ff64a6f2a9684b" Apr 23 08:56:01.553799 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:01.553650 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d9q7pt" Apr 23 08:56:05.745653 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745620 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm"] Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745916 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="extract" Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745929 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="extract" Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745949 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="util" Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745957 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="util" Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745974 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="pull" Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.745979 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="pull" Apr 23 08:56:05.746051 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.746041 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8bbbcbe-e510-4785-aac4-df3e7c59bfb9" containerName="extract" Apr 23 08:56:05.793453 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.793420 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm"] Apr 23 08:56:05.793586 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.793533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:05.796403 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.796368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 23 08:56:05.796535 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.796454 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 23 08:56:05.796535 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.796475 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6bqxr\"" Apr 23 08:56:05.853005 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.852977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22bceebd-44cd-4d1e-889b-8e0c388d2a11-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-x57qm\" (UID: 
\"22bceebd-44cd-4d1e-889b-8e0c388d2a11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:05.853155 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.853023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl9r4\" (UniqueName: \"kubernetes.io/projected/22bceebd-44cd-4d1e-889b-8e0c388d2a11-kube-api-access-xl9r4\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-x57qm\" (UID: \"22bceebd-44cd-4d1e-889b-8e0c388d2a11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:05.953479 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.953445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22bceebd-44cd-4d1e-889b-8e0c388d2a11-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-x57qm\" (UID: \"22bceebd-44cd-4d1e-889b-8e0c388d2a11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:05.953633 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.953499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl9r4\" (UniqueName: \"kubernetes.io/projected/22bceebd-44cd-4d1e-889b-8e0c388d2a11-kube-api-access-xl9r4\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-x57qm\" (UID: \"22bceebd-44cd-4d1e-889b-8e0c388d2a11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:05.953822 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.953802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22bceebd-44cd-4d1e-889b-8e0c388d2a11-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-x57qm\" (UID: \"22bceebd-44cd-4d1e-889b-8e0c388d2a11\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:05.961321 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:05.961299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl9r4\" (UniqueName: \"kubernetes.io/projected/22bceebd-44cd-4d1e-889b-8e0c388d2a11-kube-api-access-xl9r4\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-x57qm\" (UID: \"22bceebd-44cd-4d1e-889b-8e0c388d2a11\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:06.103097 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:06.103018 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" Apr 23 08:56:06.224688 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:06.224663 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm"] Apr 23 08:56:06.226958 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:56:06.226932 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22bceebd_44cd_4d1e_889b_8e0c388d2a11.slice/crio-a7132a3ede2a7e16f8cd41ed0683bcfdeecc6bcc2ccb277384c4743eed54da9c WatchSource:0}: Error finding container a7132a3ede2a7e16f8cd41ed0683bcfdeecc6bcc2ccb277384c4743eed54da9c: Status 404 returned error can't find the container with id a7132a3ede2a7e16f8cd41ed0683bcfdeecc6bcc2ccb277384c4743eed54da9c Apr 23 08:56:06.229354 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:06.229335 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 08:56:06.569208 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:06.569175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" event={"ID":"22bceebd-44cd-4d1e-889b-8e0c388d2a11","Type":"ContainerStarted","Data":"a7132a3ede2a7e16f8cd41ed0683bcfdeecc6bcc2ccb277384c4743eed54da9c"} Apr 23 08:56:08.577427 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:08.577351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" event={"ID":"22bceebd-44cd-4d1e-889b-8e0c388d2a11","Type":"ContainerStarted","Data":"d5be30f33d22fbac7a960b3a721c044563b630959dc9b9b3f66a7bd20c479155"} Apr 23 08:56:08.601984 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:08.601940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-x57qm" podStartSLOduration=1.579539607 podStartE2EDuration="3.60192535s" podCreationTimestamp="2026-04-23 08:56:05 +0000 UTC" firstStartedPulling="2026-04-23 08:56:06.229465019 +0000 UTC m=+311.204772625" lastFinishedPulling="2026-04-23 08:56:08.251850759 +0000 UTC m=+313.227158368" observedRunningTime="2026-04-23 08:56:08.599354811 +0000 UTC m=+313.574662441" watchObservedRunningTime="2026-04-23 08:56:08.60192535 +0000 UTC m=+313.577232975" Apr 23 08:56:12.124991 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.124957 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qzr8s"] Apr 23 08:56:12.128600 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.128578 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.130926 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.130884 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-zzchp\"" Apr 23 08:56:12.131102 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.131079 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 08:56:12.131181 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.131113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 08:56:12.138190 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.138171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qzr8s"] Apr 23 08:56:12.207186 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.207157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nk2\" (UniqueName: \"kubernetes.io/projected/62eb9c44-191a-44fd-8e28-330c344b9c94-kube-api-access-z4nk2\") pod \"cert-manager-cainjector-68b757865b-qzr8s\" (UID: \"62eb9c44-191a-44fd-8e28-330c344b9c94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.207309 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.207203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62eb9c44-191a-44fd-8e28-330c344b9c94-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qzr8s\" (UID: \"62eb9c44-191a-44fd-8e28-330c344b9c94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.308499 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.308453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62eb9c44-191a-44fd-8e28-330c344b9c94-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qzr8s\" (UID: \"62eb9c44-191a-44fd-8e28-330c344b9c94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.308627 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.308562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4nk2\" (UniqueName: \"kubernetes.io/projected/62eb9c44-191a-44fd-8e28-330c344b9c94-kube-api-access-z4nk2\") pod \"cert-manager-cainjector-68b757865b-qzr8s\" (UID: \"62eb9c44-191a-44fd-8e28-330c344b9c94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.316542 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.316517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nk2\" (UniqueName: \"kubernetes.io/projected/62eb9c44-191a-44fd-8e28-330c344b9c94-kube-api-access-z4nk2\") pod \"cert-manager-cainjector-68b757865b-qzr8s\" (UID: \"62eb9c44-191a-44fd-8e28-330c344b9c94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.316638 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.316517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62eb9c44-191a-44fd-8e28-330c344b9c94-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qzr8s\" (UID: \"62eb9c44-191a-44fd-8e28-330c344b9c94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.445517 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.445447 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" Apr 23 08:56:12.569327 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.569297 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qzr8s"] Apr 23 08:56:12.577666 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:56:12.577640 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62eb9c44_191a_44fd_8e28_330c344b9c94.slice/crio-ba21a0bdabaf3ffd8d01f41568ca02fd2a8f18cb6093136b0b57a65b02fdad7e WatchSource:0}: Error finding container ba21a0bdabaf3ffd8d01f41568ca02fd2a8f18cb6093136b0b57a65b02fdad7e: Status 404 returned error can't find the container with id ba21a0bdabaf3ffd8d01f41568ca02fd2a8f18cb6093136b0b57a65b02fdad7e Apr 23 08:56:12.590939 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:12.590891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" event={"ID":"62eb9c44-191a-44fd-8e28-330c344b9c94","Type":"ContainerStarted","Data":"ba21a0bdabaf3ffd8d01f41568ca02fd2a8f18cb6093136b0b57a65b02fdad7e"} Apr 23 08:56:15.601433 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:15.601396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" event={"ID":"62eb9c44-191a-44fd-8e28-330c344b9c94","Type":"ContainerStarted","Data":"0435ebfe2f660533f7a7638399a150b947b49ec8eef61b6d10fecbf17be25374"} Apr 23 08:56:15.619865 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:15.619815 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-qzr8s" podStartSLOduration=1.29978996 podStartE2EDuration="3.61980105s" podCreationTimestamp="2026-04-23 08:56:12 +0000 UTC" firstStartedPulling="2026-04-23 08:56:12.579578884 +0000 UTC m=+317.554886494" lastFinishedPulling="2026-04-23 
08:56:14.899589979 +0000 UTC m=+319.874897584" observedRunningTime="2026-04-23 08:56:15.61855797 +0000 UTC m=+320.593865598" watchObservedRunningTime="2026-04-23 08:56:15.61980105 +0000 UTC m=+320.595108676" Apr 23 08:56:23.189598 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.189564 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26"] Apr 23 08:56:23.193136 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.193119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.195723 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.195701 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 08:56:23.195819 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.195701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-xlwpr\"" Apr 23 08:56:23.196813 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.196790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 08:56:23.201507 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.201480 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26"] Apr 23 08:56:23.296613 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.296580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.296768 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.296631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.296768 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.296664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxvd\" (UniqueName: \"kubernetes.io/projected/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-kube-api-access-fxxvd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.397716 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.397684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.397878 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.397736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.397878 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.397772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxvd\" (UniqueName: \"kubernetes.io/projected/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-kube-api-access-fxxvd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.398072 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.398051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.398166 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.398144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.407415 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.407394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxvd\" (UniqueName: \"kubernetes.io/projected/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-kube-api-access-fxxvd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.507250 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.507221 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" Apr 23 08:56:23.622467 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.622443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26"] Apr 23 08:56:23.624933 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:56:23.624886 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c91533_1dd1_4d90_9ad9_da7f8f5de04d.slice/crio-bfc8632cbc629840820a9fb9bf9379754614f654af72b1267bd0cb757c636803 WatchSource:0}: Error finding container bfc8632cbc629840820a9fb9bf9379754614f654af72b1267bd0cb757c636803: Status 404 returned error can't find the container with id bfc8632cbc629840820a9fb9bf9379754614f654af72b1267bd0cb757c636803 Apr 23 08:56:23.630227 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:23.630199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" event={"ID":"74c91533-1dd1-4d90-9ad9-da7f8f5de04d","Type":"ContainerStarted","Data":"bfc8632cbc629840820a9fb9bf9379754614f654af72b1267bd0cb757c636803"} Apr 23 08:56:24.634912 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:24.634864 2575 generic.go:358] "Generic (PLEG): container finished" podID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerID="193e98f598c46e28ce66bdf12b77681ace27b8cc7a0b40adfb3e8623c2c4c0f2" exitCode=0 Apr 23 08:56:24.635268 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:24.634939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" event={"ID":"74c91533-1dd1-4d90-9ad9-da7f8f5de04d","Type":"ContainerDied","Data":"193e98f598c46e28ce66bdf12b77681ace27b8cc7a0b40adfb3e8623c2c4c0f2"} Apr 23 08:56:26.643812 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:26.643782 2575 generic.go:358] "Generic (PLEG): container finished" podID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerID="d681c05e97a863b19a1975c74c843fc27e982c16fbf6dcce05a3485c57668b5f" exitCode=0 Apr 23 08:56:26.644209 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:26.643862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" event={"ID":"74c91533-1dd1-4d90-9ad9-da7f8f5de04d","Type":"ContainerDied","Data":"d681c05e97a863b19a1975c74c843fc27e982c16fbf6dcce05a3485c57668b5f"} Apr 23 08:56:27.651670 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:27.651636 2575 generic.go:358] "Generic (PLEG): container finished" podID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerID="65d61fea8bd270228647bea1385deaf17401fd7538fbfca0eb6df02748cae4b0" exitCode=0 Apr 23 08:56:27.652126 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:27.651716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" event={"ID":"74c91533-1dd1-4d90-9ad9-da7f8f5de04d","Type":"ContainerDied","Data":"65d61fea8bd270228647bea1385deaf17401fd7538fbfca0eb6df02748cae4b0"} Apr 23 08:56:28.779923 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.776208 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26"
Apr 23 08:56:28.842994 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.842965 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-bundle\") pod \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") "
Apr 23 08:56:28.843155 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.843043 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-util\") pod \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") "
Apr 23 08:56:28.843155 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.843088 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxxvd\" (UniqueName: \"kubernetes.io/projected/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-kube-api-access-fxxvd\") pod \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\" (UID: \"74c91533-1dd1-4d90-9ad9-da7f8f5de04d\") "
Apr 23 08:56:28.843394 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.843368 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-bundle" (OuterVolumeSpecName: "bundle") pod "74c91533-1dd1-4d90-9ad9-da7f8f5de04d" (UID: "74c91533-1dd1-4d90-9ad9-da7f8f5de04d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:56:28.845088 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.845065 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-kube-api-access-fxxvd" (OuterVolumeSpecName: "kube-api-access-fxxvd") pod "74c91533-1dd1-4d90-9ad9-da7f8f5de04d" (UID: "74c91533-1dd1-4d90-9ad9-da7f8f5de04d"). InnerVolumeSpecName "kube-api-access-fxxvd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 08:56:28.848235 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.848207 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-util" (OuterVolumeSpecName: "util") pod "74c91533-1dd1-4d90-9ad9-da7f8f5de04d" (UID: "74c91533-1dd1-4d90-9ad9-da7f8f5de04d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 08:56:28.944014 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.943927 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-util\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:56:28.944014 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.943962 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fxxvd\" (UniqueName: \"kubernetes.io/projected/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-kube-api-access-fxxvd\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:56:28.944014 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:28.943987 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74c91533-1dd1-4d90-9ad9-da7f8f5de04d-bundle\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 08:56:29.659629 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:29.659598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26" event={"ID":"74c91533-1dd1-4d90-9ad9-da7f8f5de04d","Type":"ContainerDied","Data":"bfc8632cbc629840820a9fb9bf9379754614f654af72b1267bd0cb757c636803"}
Apr 23 08:56:29.659629 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:29.659629 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc8632cbc629840820a9fb9bf9379754614f654af72b1267bd0cb757c636803"
Apr 23 08:56:29.659828 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:29.659644 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e57s26"
Apr 23 08:56:34.699306 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699269 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"]
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699567 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="pull"
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699578 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="pull"
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699600 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="extract"
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699607 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="extract"
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699617 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="util"
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699623 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="util"
Apr 23 08:56:34.699683 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.699673 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="74c91533-1dd1-4d90-9ad9-da7f8f5de04d" containerName="extract"
Apr 23 08:56:34.703701 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.703684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:34.706518 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.706496 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 23 08:56:34.706627 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.706554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-8mfpm\""
Apr 23 08:56:34.707745 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.707729 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 23 08:56:34.718078 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.718055 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"]
Apr 23 08:56:34.794953 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.794892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkcxs\" (UniqueName: \"kubernetes.io/projected/7a686653-22aa-44ee-9c21-f9458eaed2ab-kube-api-access-jkcxs\") pod \"jobset-operator-747c5859c7-ntzrs\" (UID: \"7a686653-22aa-44ee-9c21-f9458eaed2ab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:34.795108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.795048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a686653-22aa-44ee-9c21-f9458eaed2ab-tmp\") pod \"jobset-operator-747c5859c7-ntzrs\" (UID: \"7a686653-22aa-44ee-9c21-f9458eaed2ab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:34.895718 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.895678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkcxs\" (UniqueName: \"kubernetes.io/projected/7a686653-22aa-44ee-9c21-f9458eaed2ab-kube-api-access-jkcxs\") pod \"jobset-operator-747c5859c7-ntzrs\" (UID: \"7a686653-22aa-44ee-9c21-f9458eaed2ab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:34.895879 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.895738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a686653-22aa-44ee-9c21-f9458eaed2ab-tmp\") pod \"jobset-operator-747c5859c7-ntzrs\" (UID: \"7a686653-22aa-44ee-9c21-f9458eaed2ab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:34.896108 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.896093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a686653-22aa-44ee-9c21-f9458eaed2ab-tmp\") pod \"jobset-operator-747c5859c7-ntzrs\" (UID: \"7a686653-22aa-44ee-9c21-f9458eaed2ab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:34.904071 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:34.904044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkcxs\" (UniqueName: \"kubernetes.io/projected/7a686653-22aa-44ee-9c21-f9458eaed2ab-kube-api-access-jkcxs\") pod \"jobset-operator-747c5859c7-ntzrs\" (UID: \"7a686653-22aa-44ee-9c21-f9458eaed2ab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:35.012647 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:35.012613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"
Apr 23 08:56:35.130014 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:35.129987 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs"]
Apr 23 08:56:35.132606 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:56:35.132568 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a686653_22aa_44ee_9c21_f9458eaed2ab.slice/crio-b6ea735cad619d9f564532ff6efdbe625e32728d564f25ecc8d878057479c3a4 WatchSource:0}: Error finding container b6ea735cad619d9f564532ff6efdbe625e32728d564f25ecc8d878057479c3a4: Status 404 returned error can't find the container with id b6ea735cad619d9f564532ff6efdbe625e32728d564f25ecc8d878057479c3a4
Apr 23 08:56:35.680084 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:35.680047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs" event={"ID":"7a686653-22aa-44ee-9c21-f9458eaed2ab","Type":"ContainerStarted","Data":"b6ea735cad619d9f564532ff6efdbe625e32728d564f25ecc8d878057479c3a4"}
Apr 23 08:56:37.690670 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:37.690639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs" event={"ID":"7a686653-22aa-44ee-9c21-f9458eaed2ab","Type":"ContainerStarted","Data":"7c3b91ee54711b037bd15765f4113ecf8b962d2712de941c7587c319ae18606b"}
Apr 23 08:56:37.707646 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:56:37.707590 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-ntzrs" podStartSLOduration=1.8209391209999999 podStartE2EDuration="3.707569771s" podCreationTimestamp="2026-04-23 08:56:34 +0000 UTC" firstStartedPulling="2026-04-23 08:56:35.134021235 +0000 UTC m=+340.109328843" lastFinishedPulling="2026-04-23 08:56:37.020651885 +0000 UTC m=+341.995959493" observedRunningTime="2026-04-23 08:56:37.706562845 +0000 UTC m=+342.681870474" watchObservedRunningTime="2026-04-23 08:56:37.707569771 +0000 UTC m=+342.682877399"
Apr 23 08:57:03.373521 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.373480 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"]
Apr 23 08:57:03.379023 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.379001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.381826 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.381793 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\""
Apr 23 08:57:03.381826 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.381803 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\""
Apr 23 08:57:03.382062 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.381823 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-zlvj5\""
Apr 23 08:57:03.383037 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.383015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 23 08:57:03.383153 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.383015 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 23 08:57:03.386665 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.386645 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"]
Apr 23 08:57:03.432979 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.432947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64llz\" (UniqueName: \"kubernetes.io/projected/c0d80a46-4f3f-4918-903f-dd0634c6ab55-kube-api-access-64llz\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.432979 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.432982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0d80a46-4f3f-4918-903f-dd0634c6ab55-cert\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.433190 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.433016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c0d80a46-4f3f-4918-903f-dd0634c6ab55-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.534016 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.533980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64llz\" (UniqueName: \"kubernetes.io/projected/c0d80a46-4f3f-4918-903f-dd0634c6ab55-kube-api-access-64llz\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.534016 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.534017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0d80a46-4f3f-4918-903f-dd0634c6ab55-cert\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.534233 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.534050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c0d80a46-4f3f-4918-903f-dd0634c6ab55-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.534619 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.534600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c0d80a46-4f3f-4918-903f-dd0634c6ab55-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.536448 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.536427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c0d80a46-4f3f-4918-903f-dd0634c6ab55-cert\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.543030 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.543004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64llz\" (UniqueName: \"kubernetes.io/projected/c0d80a46-4f3f-4918-903f-dd0634c6ab55-kube-api-access-64llz\") pod \"kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c\" (UID: \"c0d80a46-4f3f-4918-903f-dd0634c6ab55\") " pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.689665 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.689574 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:03.840000 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:03.839975 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"]
Apr 23 08:57:03.843203 ip-10-0-141-250 kubenswrapper[2575]: W0423 08:57:03.843173 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0d80a46_4f3f_4918_903f_dd0634c6ab55.slice/crio-6cf69465f46d6e2edf70dced856f0f52c42812e56b5969e0147510f9aec84817 WatchSource:0}: Error finding container 6cf69465f46d6e2edf70dced856f0f52c42812e56b5969e0147510f9aec84817: Status 404 returned error can't find the container with id 6cf69465f46d6e2edf70dced856f0f52c42812e56b5969e0147510f9aec84817
Apr 23 08:57:04.799436 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:04.798922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c" event={"ID":"c0d80a46-4f3f-4918-903f-dd0634c6ab55","Type":"ContainerStarted","Data":"6cf69465f46d6e2edf70dced856f0f52c42812e56b5969e0147510f9aec84817"}
Apr 23 08:57:06.807988 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:06.807947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c" event={"ID":"c0d80a46-4f3f-4918-903f-dd0634c6ab55","Type":"ContainerStarted","Data":"8d8a81a21aa127e87966b62232698c3162b380b493e6e031a6b9e011e9d918e4"}
Apr 23 08:57:06.808366 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:06.808017 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 08:57:06.830973 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:06.830911 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c" podStartSLOduration=1.387699628 podStartE2EDuration="3.830874588s" podCreationTimestamp="2026-04-23 08:57:03 +0000 UTC" firstStartedPulling="2026-04-23 08:57:03.845129593 +0000 UTC m=+368.820437199" lastFinishedPulling="2026-04-23 08:57:06.288304552 +0000 UTC m=+371.263612159" observedRunningTime="2026-04-23 08:57:06.829477075 +0000 UTC m=+371.804784717" watchObservedRunningTime="2026-04-23 08:57:06.830874588 +0000 UTC m=+371.806182220"
Apr 23 08:57:22.816576 ip-10-0-141-250 kubenswrapper[2575]: I0423 08:57:22.816546 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c"
Apr 23 09:00:55.459514 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:00:55.459485 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 09:00:55.461241 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:00:55.461217 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log"
Apr 23 09:01:59.825646 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.825603 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"]
Apr 23 09:01:59.827854 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.827836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:01:59.830495 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.830475 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"kube-root-ca.crt\""
Apr 23 09:01:59.830593 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.830521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"default-dockercfg-42m4p\""
Apr 23 09:01:59.830593 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.830477 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-dv8pw\"/\"openshift-service-ca.crt\""
Apr 23 09:01:59.841083 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.841059 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"]
Apr 23 09:01:59.886976 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.886937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmv58\" (UniqueName: \"kubernetes.io/projected/5627ed47-7779-48e5-8cec-0915746afa94-kube-api-access-nmv58\") pod \"progression-custom-config-node-0-0-2crjf\" (UID: \"5627ed47-7779-48e5-8cec-0915746afa94\") " pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:01:59.987745 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.987709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmv58\" (UniqueName: \"kubernetes.io/projected/5627ed47-7779-48e5-8cec-0915746afa94-kube-api-access-nmv58\") pod \"progression-custom-config-node-0-0-2crjf\" (UID: \"5627ed47-7779-48e5-8cec-0915746afa94\") " pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:01:59.995714 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:01:59.995689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmv58\" (UniqueName: \"kubernetes.io/projected/5627ed47-7779-48e5-8cec-0915746afa94-kube-api-access-nmv58\") pod \"progression-custom-config-node-0-0-2crjf\" (UID: \"5627ed47-7779-48e5-8cec-0915746afa94\") " pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:02:00.137620 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:02:00.137525 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:02:00.260133 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:02:00.260082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"]
Apr 23 09:02:00.262458 ip-10-0-141-250 kubenswrapper[2575]: W0423 09:02:00.262427 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5627ed47_7779_48e5_8cec_0915746afa94.slice/crio-5cd7108559cc3d314309341968b7ee063004ee26511df7230f66b8ccb6d4523f WatchSource:0}: Error finding container 5cd7108559cc3d314309341968b7ee063004ee26511df7230f66b8ccb6d4523f: Status 404 returned error can't find the container with id 5cd7108559cc3d314309341968b7ee063004ee26511df7230f66b8ccb6d4523f
Apr 23 09:02:00.264564 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:02:00.264547 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 09:02:00.800887 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:02:00.800840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" event={"ID":"5627ed47-7779-48e5-8cec-0915746afa94","Type":"ContainerStarted","Data":"5cd7108559cc3d314309341968b7ee063004ee26511df7230f66b8ccb6d4523f"}
Apr 23 09:03:47.202707 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:03:47.202611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" event={"ID":"5627ed47-7779-48e5-8cec-0915746afa94","Type":"ContainerStarted","Data":"c1c074b2438aa9f48ba78fb4c405491cbd490122cadc259b0df2190416640453"}
Apr 23 09:03:47.203369 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:03:47.202760 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:03:47.231093 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:03:47.231041 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" podStartSLOduration=1.56365372 podStartE2EDuration="1m48.231027283s" podCreationTimestamp="2026-04-23 09:01:59 +0000 UTC" firstStartedPulling="2026-04-23 09:02:00.264670319 +0000 UTC m=+665.239977926" lastFinishedPulling="2026-04-23 09:03:46.932043871 +0000 UTC m=+771.907351489" observedRunningTime="2026-04-23 09:03:47.229283591 +0000 UTC m=+772.204591219" watchObservedRunningTime="2026-04-23 09:03:47.231027283 +0000 UTC m=+772.206334910"
Apr 23 09:03:49.208696 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:03:49.208660 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:04:10.206624 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:10.206577 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" podUID="5627ed47-7779-48e5-8cec-0915746afa94" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": dial tcp 10.134.0.29:28080: connect: connection refused"
Apr 23 09:04:11.206666 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:11.206620 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" podUID="5627ed47-7779-48e5-8cec-0915746afa94" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": dial tcp 10.134.0.29:28080: connect: connection refused"
Apr 23 09:04:11.207088 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:11.206764 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:04:11.207311 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:11.207281 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" podUID="5627ed47-7779-48e5-8cec-0915746afa94" containerName="node" probeResult="failure" output="Get \"http://10.134.0.29:28080/metrics\": dial tcp 10.134.0.29:28080: connect: connection refused"
Apr 23 09:04:11.281970 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:11.281938 2575 generic.go:358] "Generic (PLEG): container finished" podID="5627ed47-7779-48e5-8cec-0915746afa94" containerID="c1c074b2438aa9f48ba78fb4c405491cbd490122cadc259b0df2190416640453" exitCode=0
Apr 23 09:04:11.282125 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:11.282006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" event={"ID":"5627ed47-7779-48e5-8cec-0915746afa94","Type":"ContainerDied","Data":"c1c074b2438aa9f48ba78fb4c405491cbd490122cadc259b0df2190416640453"}
Apr 23 09:04:12.413964 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:12.413939 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:04:12.536123 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:12.536050 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmv58\" (UniqueName: \"kubernetes.io/projected/5627ed47-7779-48e5-8cec-0915746afa94-kube-api-access-nmv58\") pod \"5627ed47-7779-48e5-8cec-0915746afa94\" (UID: \"5627ed47-7779-48e5-8cec-0915746afa94\") "
Apr 23 09:04:12.538156 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:12.538131 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5627ed47-7779-48e5-8cec-0915746afa94-kube-api-access-nmv58" (OuterVolumeSpecName: "kube-api-access-nmv58") pod "5627ed47-7779-48e5-8cec-0915746afa94" (UID: "5627ed47-7779-48e5-8cec-0915746afa94"). InnerVolumeSpecName "kube-api-access-nmv58". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 09:04:12.636910 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:12.636872 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmv58\" (UniqueName: \"kubernetes.io/projected/5627ed47-7779-48e5-8cec-0915746afa94-kube-api-access-nmv58\") on node \"ip-10-0-141-250.ec2.internal\" DevicePath \"\""
Apr 23 09:04:13.289489 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:13.289458 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"
Apr 23 09:04:13.289650 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:13.289490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf" event={"ID":"5627ed47-7779-48e5-8cec-0915746afa94","Type":"ContainerDied","Data":"5cd7108559cc3d314309341968b7ee063004ee26511df7230f66b8ccb6d4523f"}
Apr 23 09:04:13.289650 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:13.289523 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd7108559cc3d314309341968b7ee063004ee26511df7230f66b8ccb6d4523f"
Apr 23 09:04:17.397332 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:17.397276 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"]
Apr 23 09:04:17.402258 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:17.402231 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-dv8pw/progression-custom-config-node-0-0-2crjf"]
Apr 23 09:04:17.544743 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:17.544695 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5627ed47-7779-48e5-8cec-0915746afa94" path="/var/lib/kubelet/pods/5627ed47-7779-48e5-8cec-0915746afa94/volumes"
Apr 23 09:04:28.031480 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:28.031449 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c_c0d80a46-4f3f-4918-903f-dd0634c6ab55/manager/0.log"
Apr 23 09:04:28.483872 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:28.483839 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c_c0d80a46-4f3f-4918-903f-dd0634c6ab55/manager/0.log"
Apr 23 09:04:28.951596 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:04:28.951500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-86cf4d89c5-rsc2c_c0d80a46-4f3f-4918-903f-dd0634c6ab55/manager/0.log"
Apr 23 09:05:05.295252 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.295207 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xsfl/must-gather-sxpv6"]
Apr 23 09:05:05.295809 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.295636 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5627ed47-7779-48e5-8cec-0915746afa94" containerName="node"
Apr 23 09:05:05.295809 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.295659 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5627ed47-7779-48e5-8cec-0915746afa94" containerName="node"
Apr 23 09:05:05.295809 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.295766 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5627ed47-7779-48e5-8cec-0915746afa94" containerName="node"
Apr 23 09:05:05.298999 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.298981 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.301705 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.301681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7xsfl\"/\"openshift-service-ca.crt\""
Apr 23 09:05:05.301814 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.301724 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7xsfl\"/\"default-dockercfg-x5hz6\""
Apr 23 09:05:05.302856 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.302839 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7xsfl\"/\"kube-root-ca.crt\""
Apr 23 09:05:05.308001 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.307979 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/must-gather-sxpv6"]
Apr 23 09:05:05.406973 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.406930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/57f10b2e-ff5c-4698-a43e-cfc9edd64875-must-gather-output\") pod \"must-gather-sxpv6\" (UID: \"57f10b2e-ff5c-4698-a43e-cfc9edd64875\") " pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.407153 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.406990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swdht\" (UniqueName: \"kubernetes.io/projected/57f10b2e-ff5c-4698-a43e-cfc9edd64875-kube-api-access-swdht\") pod \"must-gather-sxpv6\" (UID: \"57f10b2e-ff5c-4698-a43e-cfc9edd64875\") " pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.508208 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.508158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/57f10b2e-ff5c-4698-a43e-cfc9edd64875-must-gather-output\") pod \"must-gather-sxpv6\" (UID: \"57f10b2e-ff5c-4698-a43e-cfc9edd64875\") " pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.508414 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.508222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swdht\" (UniqueName: \"kubernetes.io/projected/57f10b2e-ff5c-4698-a43e-cfc9edd64875-kube-api-access-swdht\") pod \"must-gather-sxpv6\" (UID: \"57f10b2e-ff5c-4698-a43e-cfc9edd64875\") " pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.508618 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.508595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/57f10b2e-ff5c-4698-a43e-cfc9edd64875-must-gather-output\") pod \"must-gather-sxpv6\" (UID: \"57f10b2e-ff5c-4698-a43e-cfc9edd64875\") " pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.517085 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.517054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swdht\" (UniqueName: \"kubernetes.io/projected/57f10b2e-ff5c-4698-a43e-cfc9edd64875-kube-api-access-swdht\") pod \"must-gather-sxpv6\" (UID: \"57f10b2e-ff5c-4698-a43e-cfc9edd64875\") " pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.609195 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.609086 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xsfl/must-gather-sxpv6"
Apr 23 09:05:05.737171 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:05.737144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/must-gather-sxpv6"]
Apr 23 09:05:05.739341 ip-10-0-141-250 kubenswrapper[2575]: W0423 09:05:05.739308 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57f10b2e_ff5c_4698_a43e_cfc9edd64875.slice/crio-48253395146c3c4ede8dddaa69aa6e525d401a63308007191c4da7789feb7be4 WatchSource:0}: Error finding container 48253395146c3c4ede8dddaa69aa6e525d401a63308007191c4da7789feb7be4: Status 404 returned error can't find the container with id 48253395146c3c4ede8dddaa69aa6e525d401a63308007191c4da7789feb7be4
Apr 23 09:05:06.481381 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:06.481343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/must-gather-sxpv6" event={"ID":"57f10b2e-ff5c-4698-a43e-cfc9edd64875","Type":"ContainerStarted","Data":"48253395146c3c4ede8dddaa69aa6e525d401a63308007191c4da7789feb7be4"}
Apr 23 09:05:07.489970 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:07.489923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/must-gather-sxpv6" event={"ID":"57f10b2e-ff5c-4698-a43e-cfc9edd64875","Type":"ContainerStarted","Data":"5160541158ef42b47bd9bcdcd26a939e30bd5a8004e58028c2820fc4a3ee0aac"}
Apr 23 09:05:07.489970 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:07.489974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/must-gather-sxpv6" event={"ID":"57f10b2e-ff5c-4698-a43e-cfc9edd64875","Type":"ContainerStarted","Data":"96e2427f7da8c3f70850d39f2160ade5588b5f414cd50480d6ddb98f7e962c7d"}
Apr 23 09:05:07.507434 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:07.507370 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-7xsfl/must-gather-sxpv6" podStartSLOduration=1.4129434189999999 podStartE2EDuration="2.507352021s" podCreationTimestamp="2026-04-23 09:05:05 +0000 UTC" firstStartedPulling="2026-04-23 09:05:05.741101181 +0000 UTC m=+850.716408787" lastFinishedPulling="2026-04-23 09:05:06.83550978 +0000 UTC m=+851.810817389" observedRunningTime="2026-04-23 09:05:07.506965135 +0000 UTC m=+852.482272764" watchObservedRunningTime="2026-04-23 09:05:07.507352021 +0000 UTC m=+852.482659650" Apr 23 09:05:08.204826 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:08.204792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qb26v_f6bb4535-9d07-4788-b02b-2c58c53b4191/global-pull-secret-syncer/0.log" Apr 23 09:05:08.297456 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:08.297427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2bjvt_d13aed77-74bb-4ef7-a5d0-ae7948dc8568/konnectivity-agent/0.log" Apr 23 09:05:08.416792 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:08.416747 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-250.ec2.internal_4720a5d3bfeaf0855890af875d341e36/haproxy/0.log" Apr 23 09:05:11.497662 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.497626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/alertmanager/0.log" Apr 23 09:05:11.523265 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.523236 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/config-reloader/0.log" Apr 23 09:05:11.549183 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.549153 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/kube-rbac-proxy-web/0.log" Apr 23 09:05:11.569964 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.569936 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/kube-rbac-proxy/0.log" Apr 23 09:05:11.594826 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.594756 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/kube-rbac-proxy-metric/0.log" Apr 23 09:05:11.614210 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.614181 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/prom-label-proxy/0.log" Apr 23 09:05:11.637286 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.637257 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_64d31fe5-d470-40ef-ac7c-e06d9804bc3b/init-config-reloader/0.log" Apr 23 09:05:11.759093 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.758999 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6597794cf6-jxhvl_222c8c75-9350-46dc-9088-28d00d4e6b2a/metrics-server/0.log" Apr 23 09:05:11.945007 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.944976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x9b4d_1d8f119e-e62b-482e-b8b2-d61c14023d7f/node-exporter/0.log" Apr 23 09:05:11.965226 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.965195 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x9b4d_1d8f119e-e62b-482e-b8b2-d61c14023d7f/kube-rbac-proxy/0.log" Apr 23 09:05:11.985470 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:11.985436 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_node-exporter-x9b4d_1d8f119e-e62b-482e-b8b2-d61c14023d7f/init-textfile/0.log" Apr 23 09:05:12.269233 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:12.269189 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79b6cb47bb-xxqfk_c97cf881-9c57-4f1a-a261-ae0ff786ad82/telemeter-client/0.log" Apr 23 09:05:12.290516 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:12.290480 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79b6cb47bb-xxqfk_c97cf881-9c57-4f1a-a261-ae0ff786ad82/reload/0.log" Apr 23 09:05:12.311235 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:12.311198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79b6cb47bb-xxqfk_c97cf881-9c57-4f1a-a261-ae0ff786ad82/kube-rbac-proxy/0.log" Apr 23 09:05:13.562692 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:13.562653 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-25tqt_8133d4a4-92a8-44ef-a085-59ed02873e69/networking-console-plugin/0.log" Apr 23 09:05:13.974247 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:13.974166 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/2.log" Apr 23 09:05:13.983212 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:13.983177 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-dm56t_b8ea5f2d-a09a-4865-8f65-103aa49ba68c/console-operator/3.log" Apr 23 09:05:14.360678 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:14.360641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ddgns_1762b05e-88c6-410f-99cf-cbd73bd4ca6e/download-server/0.log" Apr 23 09:05:14.713526 
ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:14.713447 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-rmh2t_fd81dbd9-73c8-4e7d-86c3-e33a7bae662d/volume-data-source-validator/0.log" Apr 23 09:05:14.978485 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:14.978394 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh"] Apr 23 09:05:14.983240 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:14.983211 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:14.992578 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:14.992386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh"] Apr 23 09:05:15.036150 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.036098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppk6m\" (UniqueName: \"kubernetes.io/projected/df1fa711-afc3-46bb-8851-93cb728065ec-kube-api-access-ppk6m\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.036345 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.036197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-lib-modules\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.036345 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.036236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-sys\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.036345 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.036312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-proc\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.036345 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.036334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-podres\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137299 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-sys\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-proc\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 
kubenswrapper[2575]: I0423 09:05:15.137340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-podres\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppk6m\" (UniqueName: \"kubernetes.io/projected/df1fa711-afc3-46bb-8851-93cb728065ec-kube-api-access-ppk6m\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-sys\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-proc\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-lib-modules\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " 
pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137531 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-podres\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.137837 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.137644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df1fa711-afc3-46bb-8851-93cb728065ec-lib-modules\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.145009 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.144981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppk6m\" (UniqueName: \"kubernetes.io/projected/df1fa711-afc3-46bb-8851-93cb728065ec-kube-api-access-ppk6m\") pod \"perf-node-gather-daemonset-9tghh\" (UID: \"df1fa711-afc3-46bb-8851-93cb728065ec\") " pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.297562 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.297530 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.393382 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.393347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qm6xv_f3c57c70-2bd6-42fa-9ece-35b56e75a778/dns/0.log" Apr 23 09:05:15.416227 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.416198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qm6xv_f3c57c70-2bd6-42fa-9ece-35b56e75a778/kube-rbac-proxy/0.log" Apr 23 09:05:15.428022 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.428000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh"] Apr 23 09:05:15.430721 ip-10-0-141-250 kubenswrapper[2575]: W0423 09:05:15.430692 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddf1fa711_afc3_46bb_8851_93cb728065ec.slice/crio-139668d8b4337fc4613c4063c6ce62da86deadc2a9fe2ca5be4bce1025ca0c41 WatchSource:0}: Error finding container 139668d8b4337fc4613c4063c6ce62da86deadc2a9fe2ca5be4bce1025ca0c41: Status 404 returned error can't find the container with id 139668d8b4337fc4613c4063c6ce62da86deadc2a9fe2ca5be4bce1025ca0c41 Apr 23 09:05:15.440170 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.440143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5lf2l_4e667463-9112-48df-b2c9-8ff9e9415bce/dns-node-resolver/0.log" Apr 23 09:05:15.528690 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.528653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" event={"ID":"df1fa711-afc3-46bb-8851-93cb728065ec","Type":"ContainerStarted","Data":"821b04352b8b2f3a7ce963984b75bdf2508486948fc381c6992ffa888ac3b69a"} Apr 23 09:05:15.528690 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.528694 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" event={"ID":"df1fa711-afc3-46bb-8851-93cb728065ec","Type":"ContainerStarted","Data":"139668d8b4337fc4613c4063c6ce62da86deadc2a9fe2ca5be4bce1025ca0c41"} Apr 23 09:05:15.528945 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.528729 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:15.548872 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.548757 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" podStartSLOduration=1.5487402719999999 podStartE2EDuration="1.548740272s" podCreationTimestamp="2026-04-23 09:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:05:15.546663109 +0000 UTC m=+860.521970737" watchObservedRunningTime="2026-04-23 09:05:15.548740272 +0000 UTC m=+860.524047902" Apr 23 09:05:15.868013 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.867879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7d564d9886-5ch26_d9c40a04-6569-440b-a7a3-24f158bed60b/registry/0.log" Apr 23 09:05:15.885832 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:15.885791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-56m8r_c3c1faf4-8a9e-479e-ac99-dbded210df17/node-ca/0.log" Apr 23 09:05:16.570462 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:16.570429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6c9bfcb758-sm7fz_2f8448c9-413a-4f5b-8f76-80f73e69d72f/router/0.log" Apr 23 09:05:16.877598 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:16.877486 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-qfjzv_fbb544b6-122a-4e2a-9835-e970e273e58b/serve-healthcheck-canary/0.log" Apr 23 09:05:17.251133 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:17.251097 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-nsgdp_ddc13db8-46f8-47be-b720-51cd59fd933a/insights-operator/1.log" Apr 23 09:05:17.251350 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:17.251250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-nsgdp_ddc13db8-46f8-47be-b720-51cd59fd933a/insights-operator/0.log" Apr 23 09:05:17.329294 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:17.329259 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bcg62_b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c/kube-rbac-proxy/0.log" Apr 23 09:05:17.347589 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:17.347563 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bcg62_b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c/exporter/0.log" Apr 23 09:05:17.369041 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:17.369011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bcg62_b0b74bfc-0c88-4e90-864f-9d7bdf5ed72c/extractor/0.log" Apr 23 09:05:18.975372 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:18.975342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-ntzrs_7a686653-22aa-44ee-9c21-f9458eaed2ab/jobset-operator/0.log" Apr 23 09:05:21.546128 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:21.546103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7xsfl/perf-node-gather-daemonset-9tghh" Apr 23 09:05:21.918115 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:21.918035 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-smbx4_9bb76e15-3c64-4595-835e-3e58cb47ed46/migrator/0.log" Apr 23 09:05:21.936705 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:21.936674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-smbx4_9bb76e15-3c64-4595-835e-3e58cb47ed46/graceful-termination/0.log" Apr 23 09:05:23.376059 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.375977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/kube-multus-additional-cni-plugins/0.log" Apr 23 09:05:23.397831 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.397796 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/egress-router-binary-copy/0.log" Apr 23 09:05:23.419259 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.419235 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/cni-plugins/0.log" Apr 23 09:05:23.442289 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.442263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/bond-cni-plugin/0.log" Apr 23 09:05:23.465604 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.465570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/routeoverride-cni/0.log" Apr 23 09:05:23.490503 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.490475 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/whereabouts-cni-bincopy/0.log" Apr 23 09:05:23.517862 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.517827 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c6cqt_35e62b41-5dc1-4f18-a2d1-a4c01ace11a3/whereabouts-cni/0.log" Apr 23 09:05:23.702681 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.702602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rt965_0da171ba-bef9-4402-936e-2d5afc07a732/kube-multus/0.log" Apr 23 09:05:23.721408 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.721373 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9tmnv_c32e908b-8a1f-4d28-99e1-dce39209186a/network-metrics-daemon/0.log" Apr 23 09:05:23.737134 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:23.737098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9tmnv_c32e908b-8a1f-4d28-99e1-dce39209186a/kube-rbac-proxy/0.log" Apr 23 09:05:24.829177 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.829086 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/ovn-controller/0.log" Apr 23 09:05:24.854048 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.854019 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/ovn-acl-logging/0.log" Apr 23 09:05:24.876832 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.876803 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/kube-rbac-proxy-node/0.log" Apr 23 09:05:24.897089 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.897041 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 09:05:24.913693 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.913662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/northd/0.log" Apr 23 09:05:24.932857 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.932828 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/nbdb/0.log" Apr 23 09:05:24.953236 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:24.953212 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/sbdb/0.log" Apr 23 09:05:25.123113 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:25.123022 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jtcqz_d1ed91cc-5386-4f96-93e5-b81a3c676537/ovnkube-controller/0.log" Apr 23 09:05:26.383416 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:26.383386 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-kl2k9_81dd2f7c-f618-4c84-81fd-ff2be1c08dc3/network-check-target-container/0.log" Apr 23 09:05:27.258847 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:27.258816 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kk6q7_e1c362ed-687e-4a36-bd04-7adb2e7cbf8b/iptables-alerter/0.log" Apr 23 09:05:27.874479 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:27.874441 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tm68n_91b45c99-408d-4541-b831-3c2a3f9ba542/tuned/0.log" Apr 23 09:05:29.478328 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:29.478295 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-w4vpz_f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6/cluster-samples-operator/0.log" Apr 23 09:05:29.493231 ip-10-0-141-250 kubenswrapper[2575]: I0423 09:05:29.493199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-w4vpz_f1eafad1-6aa8-49a0-a234-8f8cedbbb3d6/cluster-samples-operator-watch/0.log"