Apr 22 18:40:07.546208 ip-10-0-130-32 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:40:07.546219 ip-10-0-130-32 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:40:07.546226 ip-10-0-130-32 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:40:07.546442 ip-10-0-130-32 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:40:17.677402 ip-10-0-130-32 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:40:17.677421 ip-10-0-130-32 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c98d5f78bb3d456c89091ad4d4f43394 --
Apr 22 18:42:43.673038 ip-10-0-130-32 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:42:43.982928 ip-10-0-130-32 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:42:43.982928 ip-10-0-130-32 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:42:43.982928 ip-10-0-130-32 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:42:43.982928 ip-10-0-130-32 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:42:43.982928 ip-10-0-130-32 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
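Note: each deprecation warning above points at the same remedy: move the flag into the KubeletConfiguration file passed via --config (this node passes /etc/kubernetes/kubelet.conf, per the FLAG dump further down). A minimal sketch of the equivalent config stanza, assuming the flag values that appear later in this log; the unix:// scheme and the evictionHard threshold are illustrative placeholders, not values taken from this node:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint=/var/run/crio/crio.sock (scheme prefix assumed)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir=/etc/kubernetes/kubelet-plugins/volume/exec
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
    systemReserved:
      cpu: 500m
      memory: 1Gi
      ephemeral-storage: 1Gi
    # --minimum-container-ttl-duration is deprecated in favour of eviction settings;
    # the threshold below is a placeholder, not taken from this log
    evictionHard:
      memory.available: 100Mi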
Apr 22 18:42:43.984702 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.984622 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 22 18:42:43.989422 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989399 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:43.989422 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989421 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:43.989422 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989426 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989430 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989435 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989438 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989441 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989444 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989447 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989449 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989452 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989455 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989458 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989460 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989463 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989465 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989468 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989471 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989474 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989476 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989479 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:43.989565 
ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989482 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:43.989565 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989484 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989487 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989489 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989492 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989495 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989497 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989500 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989502 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989505 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989507 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989511 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989514 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989516 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989519 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989522 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989526 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989530 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989535 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989538 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989541 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:43.990065 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989544 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989547 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989549 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989552 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989555 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989558 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989560 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989563 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989565 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989568 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989571 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989574 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989577 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989580 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989582 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989585 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989587 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989590 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989592 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:43.990598 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989595 2575 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989598 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989600 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989603 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989605 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989608 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989610 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989614 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989617 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989620 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989624 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989627 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989630 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989632 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989635 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989637 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989640 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989644 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989646 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:43.991073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989649 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989651 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989655 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989658 2575 feature_gate.go:328] unrecognized feature 
gate: SigstoreImageVerification Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989660 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.989663 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990065 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990071 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990074 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990077 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990080 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990082 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990085 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990088 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990091 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990093 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990096 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990099 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990103 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:43.991530 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990106 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990110 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990113 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990116 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990118 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990121 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990124 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990126 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990129 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990132 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990134 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990137 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990140 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990142 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990145 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990148 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990151 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990153 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990156 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990158 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:43.992012 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990161 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990163 2575 feature_gate.go:328] 
unrecognized feature gate: PinnedImages Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990166 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990168 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990170 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990173 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990175 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990178 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990180 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990183 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990185 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990188 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990190 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990194 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990197 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990199 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990202 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990204 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990207 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990209 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:43.992514 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990212 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990214 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990217 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990220 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:43.993054 ip-10-0-130-32 
kubenswrapper[2575]: W0422 18:42:43.990222 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990225 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990227 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990230 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990233 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990236 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990238 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990241 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990243 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990246 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990248 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990251 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990254 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990256 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990259 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990261 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:43.993054 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990263 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990266 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990269 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990271 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990274 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990277 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990279 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes 
Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990283 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990287 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990290 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990294 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990297 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.990300 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991359 2575 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991368 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991376 2575 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991381 2575 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991391 2575 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991395 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991399 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991403 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 18:42:43.993539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991407 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991410 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991413 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991416 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991419 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991422 2575 flags.go:64] FLAG: --cgroup-root="" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991425 2575 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991428 2575 flags.go:64] FLAG: --client-ca-file="" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991431 2575 flags.go:64] FLAG: --cloud-config="" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991434 2575 flags.go:64] FLAG: --cloud-provider="external" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 
18:42:43.991437 2575 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991441 2575 flags.go:64] FLAG: --cluster-domain="" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991443 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991446 2575 flags.go:64] FLAG: --config-dir="" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991449 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991452 2575 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991456 2575 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991459 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991463 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991466 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991469 2575 flags.go:64] FLAG: --contention-profiling="false" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991472 2575 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991475 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991478 2575 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991481 2575 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 18:42:43.994068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991485 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991487 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991490 2575 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991493 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991497 2575 flags.go:64] FLAG: --enable-server="true" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991499 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991504 2575 flags.go:64] FLAG: --event-burst="100" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991508 2575 flags.go:64] FLAG: --event-qps="50" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991510 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991513 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991516 2575 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: 
I0422 18:42:43.991520 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991523 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991526 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991529 2575 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991532 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991535 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991538 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991541 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991544 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991547 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991550 2575 flags.go:64] FLAG: --feature-gates="" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991554 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991557 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991560 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:42:43.994672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991564 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991567 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991570 2575 flags.go:64] FLAG: --help="false" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991575 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-130-32.ec2.internal" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991578 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991581 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991584 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991587 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991591 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991594 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991597 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 
18:42:43.991599 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991602 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991606 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991609 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991612 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991614 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991617 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991620 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991623 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991626 2575 flags.go:64] FLAG: --lock-file="" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991629 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991632 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991635 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:42:43.995284 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991640 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991643 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991646 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991648 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991651 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991655 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991658 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991661 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991665 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991668 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991672 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991676 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991679 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991682 
2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991685 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991688 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991690 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991693 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991700 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991703 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991706 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991710 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991713 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:42:43.995893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991718 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991721 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991724 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991726 2575 flags.go:64] FLAG: --port="10250" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991729 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991732 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c846cea89daa078b" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991736 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991739 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991742 2575 flags.go:64] FLAG: --register-node="true" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991744 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991747 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991751 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991753 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991756 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991759 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991763 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:42:43.996453 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:42:43.991766 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991784 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991790 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991795 2575 flags.go:64] FLAG: --runonce="false" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991800 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991803 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991806 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991809 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991811 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991815 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:42:43.996453 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991818 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991821 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991823 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991826 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991829 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991832 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991836 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991838 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991841 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991847 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991850 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991852 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991856 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991859 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991861 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991864 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:42:43.997163 
ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991867 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991870 2575 flags.go:64] FLAG: --v="2" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991874 2575 flags.go:64] FLAG: --version="false" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991884 2575 flags.go:64] FLAG: --vmodule="" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991890 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.991894 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.991984 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.991988 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:43.997163 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.991991 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.991994 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.991999 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992002 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992004 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992007 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992010 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992013 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992016 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992019 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992021 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992025 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992028 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992031 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992034 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992037 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992039 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992042 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992045 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992047 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:43.997759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992050 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992052 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992055 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992057 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992060 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992064 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992067 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992071 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992074 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992076 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992079 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992082 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992085 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992087 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992091 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992093 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992096 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992098 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992101 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:43.998319 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992103 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992106 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992109 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992111 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992114 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992116 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992118 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992121 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992124 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992127 2575 feature_gate.go:328] unrecognized feature 
gate: ClusterVersionOperatorConfiguration Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992129 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992132 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992134 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992137 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992139 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992141 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992144 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992146 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992149 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:43.998805 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992151 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992154 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992156 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992162 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992164 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992167 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992170 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992172 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992176 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992178 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992181 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992184 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992186 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 
18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992189 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992191 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992194 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992196 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992199 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992201 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992203 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:43.999280 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992206 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992208 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992211 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992214 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992217 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.992219 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.992682 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.999182 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.999289 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999335 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999340 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999344 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999347 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999350 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999353 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:43.999794 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999356 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999358 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999361 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999363 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999366 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999369 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999373 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999375 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999378 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999381 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999384 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999386 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999389 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999391 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999394 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999398 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999403 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999406 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999409 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999412 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:44.000189 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999415 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999417 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999420 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999423 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999425 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999428 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999433 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999436 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999439 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999442 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999445 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999447 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999450 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999453 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999455 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999458 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999460 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999463 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 
18:42:43.999466 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999468 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:44.000682 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999471 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999473 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999476 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999478 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999481 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999483 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999486 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999488 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999491 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999493 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999496 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999498 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999502 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999505 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999508 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999510 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999513 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999516 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999518 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999521 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:44.001257 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999524 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:44.001737 
ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999526 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999529 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999531 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999534 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999537 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999539 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999542 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999544 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999547 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999549 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999552 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999555 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999558 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999560 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999563 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999565 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999568 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999570 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:44.001737 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999573 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.999578 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999672 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999677 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999680 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999683 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999686 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999690 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999693 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999696 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999698 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999701 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999703 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999706 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999709 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999712 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:44.002237 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999714 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999717 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999719 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999722 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999725 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999727 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999731 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999735 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999738 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999741 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999743 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999746 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999749 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999752 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999755 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999758 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999760 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999763 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999765 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:44.002640 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999768 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999788 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999792 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999795 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999799 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999802 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999806 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999808 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999811 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999813 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999816 2575 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999819 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999821 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999824 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999826 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999829 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999831 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999834 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999836 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999839 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:44.003110 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999841 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999844 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999846 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999849 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999852 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999856 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999858 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999861 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999864 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999866 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999869 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999871 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999874 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999877 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999887 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999890 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999893 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999896 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999899 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:44.003597 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999902 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999904 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999907 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999909 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999912 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999915 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999917 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999919 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:44.004073 ip-10-0-130-32 
kubenswrapper[2575]: W0422 18:42:43.999922 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999925 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999927 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999929 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999932 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:43.999934 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:43.999940 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:42:44.004073 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.000539 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:42:44.004444 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.003336 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:42:44.004444 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.004098 2575 server.go:1019] "Starting client certificate rotation" Apr 22 18:42:44.004444 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.004193 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:42:44.004444 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.004223 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:42:44.021872 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.021855 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:42:44.025732 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.025715 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:42:44.040222 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.040185 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:42:44.044612 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.044597 2575 log.go:25] "Validated CRI v1 image API" Apr 22 18:42:44.045724 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.045709 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:42:44.051052 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.051033 2575 fs.go:135] Filesystem UUIDs: map[30ec44cb-d592-48d6-9f37-936ee2088814:/dev/nvme0n1p3 72f795c3-7816-4569-ad83-a0f2a494b841:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 22 18:42:44.051124 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.051051 2575 
fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:42:44.052594 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.052578 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:42:44.056720 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.056616 2575 manager.go:217] Machine: {Timestamp:2026-04-22 18:42:44.055110715 +0000 UTC m=+0.294743069 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094979 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b6d5673ba46a5ab0ca2ff6c3be845 SystemUUID:ec2b6d56-73ba-46a5-ab0c-a2ff6c3be845 BootID:c98d5f78-bb3d-456c-8909-1ad4d4f43394 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:82:f5:f1:ab:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:82:f5:f1:ab:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:ef:26:f1:fc:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:42:44.057063 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.057053 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:42:44.057184 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.057172 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:42:44.058058 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.058035 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:42:44.058190 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.058061 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-32.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:42:44.058232 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.058216 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:42:44.058232 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.058225 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:42:44.058290 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.058237 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:42:44.059422 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.059412 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:42:44.060600 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.060590 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:42:44.060707 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.060697 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:42:44.062466 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.062457 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:42:44.062516 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.062470 2575 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 22 18:42:44.062516 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.062484 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:42:44.062516 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.062493 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:42:44.062516 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.062501 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:42:44.063346 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.063333 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:42:44.063387 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.063352 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:42:44.065927 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.065913 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:42:44.067133 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.067120 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:42:44.068486 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068474 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:42:44.068522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068493 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:42:44.068522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068502 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:42:44.068522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068510 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:42:44.068522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068517 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:42:44.068522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068523 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:42:44.068653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068530 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:42:44.068653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068536 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:42:44.068653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068543 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:42:44.068653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068549 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:42:44.068653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068569 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:42:44.068653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.068581 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:42:44.069856 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.069843 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:42:44.069856 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.069858 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 18:42:44.072138 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.072121 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-32.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:42:44.072206 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.072153 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-32.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:42:44.072206 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.072154 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:42:44.073366 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.073354 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:42:44.073411 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.073390 2575 server.go:1295] "Started kubelet" Apr 22 18:42:44.073497 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.073473 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:42:44.073532 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.073490 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:42:44.073563 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.073552 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:42:44.074139 ip-10-0-130-32 systemd[1]: Started Kubernetes Kubelet. 
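[Editor's note, illustrative only] The "system:anonymous" forbidden errors above and just below this point come from the kubelet's first API calls, made before its client-certificate bootstrap has completed; a few entries later the client CSR csr-9kfsw is approved and issued, and node registration then succeeds further below. As a minimal, hedged sketch (not taken from this node), the client-go snippet below lists kubelet client-bootstrap CSRs of this kind; the kubeconfig path is an assumed placeholder, and the signer name is the one this kubelet logs.

package main

import (
	"context"
	"fmt"

	certv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed admin kubeconfig location; adjust for the environment at hand.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		// Keep only kubelet client certificates; this signer name matches the
		// logger shown in the journal entries nearby.
		if csr.Spec.SignerName != "kubernetes.io/kube-apiserver-client-kubelet" {
			continue
		}
		approved := false
		for _, c := range csr.Status.Conditions {
			if c.Type == certv1.CertificateApproved {
				approved = true
			}
		}
		fmt.Printf("%s requestor=%s approved=%v issued=%v\n",
			csr.Name, csr.Spec.Username, approved, len(csr.Status.Certificate) > 0)
	}
}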
Apr 22 18:42:44.074444 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.074422 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:42:44.075337 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.075323 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:42:44.077814 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.077140 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-32.ec2.internal.18a8c2015ae4e835 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-32.ec2.internal,UID:ip-10-0-130-32.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-32.ec2.internal,},FirstTimestamp:2026-04-22 18:42:44.073367605 +0000 UTC m=+0.312999957,LastTimestamp:2026-04-22 18:42:44.073367605 +0000 UTC m=+0.312999957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-32.ec2.internal,}" Apr 22 18:42:44.079706 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.079687 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:42:44.080065 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.080020 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:42:44.080727 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.080704 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:42:44.080823 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.080711 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:42:44.080823 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.080745 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:42:44.080823 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.080816 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:42:44.080969 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.080827 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:42:44.081015 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081006 2575 factory.go:55] Registering systemd factory Apr 22 18:42:44.081060 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081020 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:42:44.081386 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081373 2575 factory.go:153] Registering CRI-O factory Apr 22 18:42:44.081386 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081388 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 18:42:44.081518 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081435 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:42:44.081518 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081477 2575 factory.go:103] Registering Raw factory Apr 22 18:42:44.081518 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.081503 2575 manager.go:1196] Started watching 
for new ooms in manager Apr 22 18:42:44.082146 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.082129 2575 manager.go:319] Starting recovery of all containers Apr 22 18:42:44.082209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.082175 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.082674 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.082650 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-32.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:42:44.082674 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.082652 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:42:44.082855 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.082821 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9kfsw" Apr 22 18:42:44.084457 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.084436 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:42:44.092333 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.092318 2575 manager.go:324] Recovery completed Apr 22 18:42:44.095807 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.095790 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9kfsw" Apr 22 18:42:44.097825 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.097806 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:44.100149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.100134 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:44.100219 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.100163 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:44.100219 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.100174 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:44.100632 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.100617 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:42:44.100632 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.100628 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:42:44.100760 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.100643 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:42:44.103729 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.103715 2575 policy_none.go:49] "None policy: Start" Apr 22 18:42:44.103812 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.103732 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:42:44.103812 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:42:44.103744 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.137304 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.137334 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.137344 2575 server.go:85] "Starting device plugin registration server" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.137545 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.137971 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.138069 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.138126 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.138135 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.138794 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:42:44.155464 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.138820 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.187974 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.187951 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:42:44.189167 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.189150 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:42:44.189167 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.189171 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:42:44.189287 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.189184 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
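[Editor's note, illustrative only] The eviction manager and plugin manager starting here run against the reserved-resource and hard-eviction settings echoed in the container-manager entry earlier in this boot (systemd cgroup driver, systemReserved cpu=500m / memory=1Gi / ephemeral-storage=1Gi, pod PIDs limit 4096, and hard evictions at memory.available<100Mi, nodefs.available<10%, nodefs.inodesFree<5%, imagefs.available<15%, imagefs.inodesFree<5%). As a rough sketch of the same values expressed with the upstream KubeletConfiguration v1beta1 types, see the snippet below; on an OpenShift node the real file is cluster-managed, so this is only an illustration of the shape, not the actual source.

package main

import (
	"fmt"

	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	pidsLimit := int64(4096)
	cfg := kubeletv1beta1.KubeletConfiguration{
		CgroupDriver: "systemd",
		PodPidsLimit: &pidsLimit,
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"memory":            "1Gi",
			"ephemeral-storage": "1Gi",
		},
		EvictionHard: map[string]string{
			"memory.available":   "100Mi",
			"nodefs.available":   "10%",
			"nodefs.inodesFree":  "5%",
			"imagefs.available":  "15%",
			"imagefs.inodesFree": "5%",
		},
	}
	cfg.APIVersion = "kubelet.config.k8s.io/v1beta1"
	cfg.Kind = "KubeletConfiguration"

	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}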
Apr 22 18:42:44.189287 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.189192 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:42:44.189287 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.189261 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:42:44.191544 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.191526 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:44.239285 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.239257 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:44.240029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.240012 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:44.240107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.240042 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:44.240107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.240056 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:44.240107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.240081 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.248512 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.248470 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.248512 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.248489 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-32.ec2.internal\": node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.266671 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.266647 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.289361 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.289340 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal"] Apr 22 18:42:44.289430 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.289395 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:44.290564 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.290540 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:44.290564 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.290563 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:44.290662 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.290573 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:44.292899 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.292887 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:44.293033 ip-10-0-130-32 kubenswrapper[2575]: I0422 
18:42:44.293019 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.293087 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293049 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:44.293522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293506 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:44.293585 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293534 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:44.293585 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293510 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:44.293585 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293564 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:44.293585 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293577 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:44.293762 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.293543 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:44.295858 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.295843 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.295953 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.295864 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:44.296459 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.296446 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:44.296528 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.296468 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:44.296528 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.296478 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:44.320433 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.320413 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-32.ec2.internal\" not found" node="ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.323722 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.323702 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-32.ec2.internal\" not found" node="ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.367575 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.367557 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.382435 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.382415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77d0c29eeab8147de5aeed09c8b86101-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal\" (UID: \"77d0c29eeab8147de5aeed09c8b86101\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.382488 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.382440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77d0c29eeab8147de5aeed09c8b86101-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal\" (UID: \"77d0c29eeab8147de5aeed09c8b86101\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.382488 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.382457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/150594baeab35cb17ccfa66548a34222-config\") pod \"kube-apiserver-proxy-ip-10-0-130-32.ec2.internal\" (UID: \"150594baeab35cb17ccfa66548a34222\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.467667 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.467632 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.483123 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.483099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77d0c29eeab8147de5aeed09c8b86101-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal\" (UID: \"77d0c29eeab8147de5aeed09c8b86101\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.483209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.483125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77d0c29eeab8147de5aeed09c8b86101-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal\" (UID: \"77d0c29eeab8147de5aeed09c8b86101\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.483209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.483173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77d0c29eeab8147de5aeed09c8b86101-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal\" (UID: \"77d0c29eeab8147de5aeed09c8b86101\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.483209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.483189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/150594baeab35cb17ccfa66548a34222-config\") pod \"kube-apiserver-proxy-ip-10-0-130-32.ec2.internal\" (UID: \"150594baeab35cb17ccfa66548a34222\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.483312 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.483222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/150594baeab35cb17ccfa66548a34222-config\") pod \"kube-apiserver-proxy-ip-10-0-130-32.ec2.internal\" (UID: \"150594baeab35cb17ccfa66548a34222\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.483312 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.483250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77d0c29eeab8147de5aeed09c8b86101-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal\" (UID: \"77d0c29eeab8147de5aeed09c8b86101\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.568546 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.568467 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.622010 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.621989 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.626403 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.626386 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" Apr 22 18:42:44.668850 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.668824 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.769312 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.769289 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.869898 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.869835 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:44.954886 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:44.954858 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:44.970803 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:44.970781 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.004306 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.004287 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:42:45.004901 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.004404 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:42:45.004901 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.004450 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:42:45.065822 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.065798 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:45.071679 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:45.071651 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.080944 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.080922 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:42:45.098208 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.098177 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:37:44 +0000 UTC" deadline="2028-01-22 16:37:07.62971712 +0000 UTC" Apr 22 18:42:45.098208 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.098202 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15357h54m22.531519449s" Apr 22 18:42:45.101070 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.101054 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:42:45.137296 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.137238 2575 csr.go:274] 
"Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vc6ws" Apr 22 18:42:45.153320 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.153164 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vc6ws" Apr 22 18:42:45.172085 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:45.172045 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.227649 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:45.227604 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150594baeab35cb17ccfa66548a34222.slice/crio-0b2f9f9eb22656982ab67752816b051725d6a37febc2acabdad5b62d7cbe787b WatchSource:0}: Error finding container 0b2f9f9eb22656982ab67752816b051725d6a37febc2acabdad5b62d7cbe787b: Status 404 returned error can't find the container with id 0b2f9f9eb22656982ab67752816b051725d6a37febc2acabdad5b62d7cbe787b Apr 22 18:42:45.228163 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:45.228138 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d0c29eeab8147de5aeed09c8b86101.slice/crio-caf009cab56f2fd9975b5ebd473da4e26d1e13484cb47fb45b2907924a87e93b WatchSource:0}: Error finding container caf009cab56f2fd9975b5ebd473da4e26d1e13484cb47fb45b2907924a87e93b: Status 404 returned error can't find the container with id caf009cab56f2fd9975b5ebd473da4e26d1e13484cb47fb45b2907924a87e93b Apr 22 18:42:45.231999 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.231987 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:42:45.273060 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:45.273024 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.373528 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:45.373486 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.474129 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:45.474036 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.574839 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:45.574809 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-32.ec2.internal\" not found" Apr 22 18:42:45.608689 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.608663 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:45.681103 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.681065 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" Apr 22 18:42:45.697161 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.697124 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:42:45.698196 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.698163 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" Apr 
22 18:42:45.809398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:45.809324 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:42:46.062762 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.062672 2575 apiserver.go:52] "Watching apiserver" Apr 22 18:42:46.069640 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.069614 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:42:46.071500 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.071460 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7zmbr","openshift-cluster-node-tuning-operator/tuned-6cfpf","openshift-image-registry/node-ca-h7ks7","openshift-network-diagnostics/network-check-target-ql9wt","openshift-network-operator/iptables-alerter-kjkxq","openshift-ovn-kubernetes/ovnkube-node-hwf7s","kube-system/konnectivity-agent-985jk","kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf","openshift-dns/node-resolver-5jr6w","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal","openshift-multus/multus-additional-cni-plugins-6x6wq","openshift-multus/multus-sspfl"] Apr 22 18:42:46.073971 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.073952 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.076169 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.076150 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.076309 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.076300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:42:46.076560 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.076543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:42:46.076689 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.076664 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zqm56\"" Apr 22 18:42:46.078342 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.078324 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.078342 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.078336 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.078675 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.078658 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4xfr4\"" Apr 22 18:42:46.078784 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.078759 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.080397 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.080355 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-75794\"" Apr 22 18:42:46.080538 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.080519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:42:46.081082 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.081057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.081228 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.081192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.083601 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.083584 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:46.083707 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.083681 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:42:46.085848 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.085818 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.086225 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.085942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.088255 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.088229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.088351 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.088291 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:46.088351 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.088300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.088579 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.088449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:42:46.089412 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089388 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:42:46.089412 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6725v\"" Apr 22 18:42:46.089551 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089432 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:42:46.089551 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089395 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.089551 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089523 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:42:46.089959 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089940 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:42:46.090050 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089949 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.090050 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.089949 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.090050 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.090032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dqsnv\"" Apr 22 18:42:46.090553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.090536 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.092704 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.092674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cc6477a3-da8c-40f7-ae67-bf32ede541af-serviceca\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.092817 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.092722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8cd060fb-c75b-48aa-888f-5176f41de266-iptables-alerter-script\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.092817 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.092745 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.092817 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.092747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cd060fb-c75b-48aa-888f-5176f41de266-host-slash\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.092817 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.092789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.093064 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093050 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:42:46.093117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093054 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xnt57\"" Apr 22 18:42:46.093117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093077 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.093415 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-kubernetes\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093493 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-sys\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093493 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-tuned\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093588 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqpht\" (UniqueName: \"kubernetes.io/projected/cc6477a3-da8c-40f7-ae67-bf32ede541af-kube-api-access-kqpht\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.093622 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wfh4\" (UniqueName: \"kubernetes.io/projected/8cd060fb-c75b-48aa-888f-5176f41de266-kube-api-access-4wfh4\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.093622 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-modprobe-d\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093685 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093635 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysconfig\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093685 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysctl-conf\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093683 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-run\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-lib-modules\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-var-lib-kubelet\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.093800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6477a3-da8c-40f7-ae67-bf32ede541af-host\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.093800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysctl-d\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-systemd\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/047be1c6-96f6-47bd-80c1-539db0f3b59c-tmp\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfnl\" (UniqueName: \"kubernetes.io/projected/047be1c6-96f6-47bd-80c1-539db0f3b59c-kube-api-access-fcfnl\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: 
\"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093919 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ec13a05a-498a-4a5e-a065-7e57635aafff-agent-certs\") pod \"konnectivity-agent-985jk\" (UID: \"ec13a05a-498a-4a5e-a065-7e57635aafff\") " pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ec13a05a-498a-4a5e-a065-7e57635aafff-konnectivity-ca\") pod \"konnectivity-agent-985jk\" (UID: \"ec13a05a-498a-4a5e-a065-7e57635aafff\") " pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.094018 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.093985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-host\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.095249 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.095172 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.095575 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.095434 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hztd2\"" Apr 22 18:42:46.095575 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.095494 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.095722 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.095600 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.097843 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.097827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:42:46.097941 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.097877 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.098003 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.097991 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:42:46.098054 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.098005 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nd27q\"" Apr 22 18:42:46.098180 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.098164 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:42:46.098237 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.098198 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:42:46.098293 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.098276 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:42:46.100156 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.100135 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:42:46.100342 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.100323 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g422j\"" Apr 22 18:42:46.153925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.153897 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:37:45 +0000 UTC" deadline="2027-12-20 02:10:21.324096081 +0000 UTC" Apr 22 18:42:46.153925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.153924 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14551h27m35.170175535s" Apr 22 18:42:46.159281 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.159265 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:46.182328 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.182305 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:42:46.193244 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.193190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" event={"ID":"150594baeab35cb17ccfa66548a34222","Type":"ContainerStarted","Data":"0b2f9f9eb22656982ab67752816b051725d6a37febc2acabdad5b62d7cbe787b"} Apr 22 18:42:46.194183 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" event={"ID":"77d0c29eeab8147de5aeed09c8b86101","Type":"ContainerStarted","Data":"caf009cab56f2fd9975b5ebd473da4e26d1e13484cb47fb45b2907924a87e93b"} Apr 22 18:42:46.194273 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-cni-netd\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194273 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-tuned\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.194273 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqpht\" (UniqueName: \"kubernetes.io/projected/cc6477a3-da8c-40f7-ae67-bf32ede541af-kube-api-access-kqpht\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.194273 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wfh4\" (UniqueName: \"kubernetes.io/projected/8cd060fb-c75b-48aa-888f-5176f41de266-kube-api-access-4wfh4\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-kubelet\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6477a3-da8c-40f7-ae67-bf32ede541af-host\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-slash\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-cnibin\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfnl\" (UniqueName: 
\"kubernetes.io/projected/047be1c6-96f6-47bd-80c1-539db0f3b59c-kube-api-access-fcfnl\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6477a3-da8c-40f7-ae67-bf32ede541af-host\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovnkube-script-lib\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194458 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-k8s-cni-cncf-io\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f44ss\" (UniqueName: \"kubernetes.io/projected/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-kube-api-access-f44ss\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ec13a05a-498a-4a5e-a065-7e57635aafff-agent-certs\") pod \"konnectivity-agent-985jk\" (UID: \"ec13a05a-498a-4a5e-a065-7e57635aafff\") " pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-os-release\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-kubelet\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 
18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-kubernetes\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-systemd\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovnkube-config\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194695 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3bf65c2b-0944-4d58-bd8b-923617359ff3-hosts-file\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-cni-binary-copy\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-kubernetes\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-lib-modules\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-var-lib-kubelet\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-node-log\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-device-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.194914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmgvw\" (UniqueName: \"kubernetes.io/projected/19ace946-23b0-451c-93fa-078938130dd5-kube-api-access-pmgvw\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-cnibin\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-cni-multus\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-lib-modules\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysctl-d\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-ovn\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-var-lib-kubelet\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.194984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-cni-bin\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8cd060fb-c75b-48aa-888f-5176f41de266-iptables-alerter-script\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-run-ovn-kubernetes\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovn-node-metrics-cert\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195059 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gr4\" (UniqueName: \"kubernetes.io/projected/e7e79aca-b413-4120-897d-95784a08f56f-kube-api-access-78gr4\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysctl-d\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-sys\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.195641 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-etc-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-log-socket\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-sys\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-cni-binary-copy\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-socket-dir-parent\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-modprobe-d\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-sys-fs\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 
18:42:46.195492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6tc\" (UniqueName: \"kubernetes.io/projected/3bf65c2b-0944-4d58-bd8b-923617359ff3-kube-api-access-ms6tc\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-netns\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8cd060fb-c75b-48aa-888f-5176f41de266-iptables-alerter-script\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-systemd\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-modprobe-d\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-cni-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-systemd\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-multus-certs\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-host\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cc6477a3-da8c-40f7-ae67-bf32ede541af-serviceca\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.196401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cd060fb-c75b-48aa-888f-5176f41de266-host-slash\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-host\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-env-overrides\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-hostroot\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-run-netns\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-cni-bin\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-registration-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3bf65c2b-0944-4d58-bd8b-923617359ff3-tmp-dir\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.195990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxm6\" (UniqueName: \"kubernetes.io/projected/5e2e434e-269f-4708-b72e-607842cf2bd9-kube-api-access-ktxm6\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-os-release\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysconfig\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysctl-conf\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-run\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-socket-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cc6477a3-da8c-40f7-ae67-bf32ede541af-serviceca\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.197199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-etc-selinux\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cd060fb-c75b-48aa-888f-5176f41de266-host-slash\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-system-cni-dir\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysconfig\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/047be1c6-96f6-47bd-80c1-539db0f3b59c-tmp\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-systemd-units\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-system-cni-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-conf-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ec13a05a-498a-4a5e-a065-7e57635aafff-konnectivity-ca\") pod \"konnectivity-agent-985jk\" (UID: \"ec13a05a-498a-4a5e-a065-7e57635aafff\") " pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-var-lib-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-sysctl-conf\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw644\" (UniqueName: \"kubernetes.io/projected/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-kube-api-access-fw644\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/047be1c6-96f6-47bd-80c1-539db0f3b59c-run\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196522 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.196552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-daemon-config\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: 
I0422 18:42:46.196576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-etc-kubernetes\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.197925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.197148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ec13a05a-498a-4a5e-a065-7e57635aafff-konnectivity-ca\") pod \"konnectivity-agent-985jk\" (UID: \"ec13a05a-498a-4a5e-a065-7e57635aafff\") " pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.198611 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.198179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/047be1c6-96f6-47bd-80c1-539db0f3b59c-etc-tuned\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.198611 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.198452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ec13a05a-498a-4a5e-a065-7e57635aafff-agent-certs\") pod \"konnectivity-agent-985jk\" (UID: \"ec13a05a-498a-4a5e-a065-7e57635aafff\") " pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.198694 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.198620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/047be1c6-96f6-47bd-80c1-539db0f3b59c-tmp\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.201385 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.201365 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:46.201488 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.201388 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:46.201488 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.201401 2575 projected.go:194] Error preparing data for projected volume kube-api-access-q8xjc for pod openshift-network-diagnostics/network-check-target-ql9wt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:46.201488 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.201472 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc podName:3ff2e0de-9129-4246-968b-183ec5c37452 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:46.701443808 +0000 UTC m=+2.941076159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q8xjc" (UniqueName: "kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc") pod "network-check-target-ql9wt" (UID: "3ff2e0de-9129-4246-968b-183ec5c37452") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:46.203575 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.203555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wfh4\" (UniqueName: \"kubernetes.io/projected/8cd060fb-c75b-48aa-888f-5176f41de266-kube-api-access-4wfh4\") pod \"iptables-alerter-kjkxq\" (UID: \"8cd060fb-c75b-48aa-888f-5176f41de266\") " pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.203807 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.203785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqpht\" (UniqueName: \"kubernetes.io/projected/cc6477a3-da8c-40f7-ae67-bf32ede541af-kube-api-access-kqpht\") pod \"node-ca-h7ks7\" (UID: \"cc6477a3-da8c-40f7-ae67-bf32ede541af\") " pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.206574 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.206554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcfnl\" (UniqueName: \"kubernetes.io/projected/047be1c6-96f6-47bd-80c1-539db0f3b59c-kube-api-access-fcfnl\") pod \"tuned-6cfpf\" (UID: \"047be1c6-96f6-47bd-80c1-539db0f3b59c\") " pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.296885 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.296885 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-cni-bin\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-registration-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3bf65c2b-0944-4d58-bd8b-923617359ff3-tmp-dir\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: 
\"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.296976 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-cni-bin\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxm6\" (UniqueName: \"kubernetes.io/projected/5e2e434e-269f-4708-b72e-607842cf2bd9-kube-api-access-ktxm6\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-os-release\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-socket-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-registration-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.297117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-etc-selinux\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-system-cni-dir\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-socket-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-system-cni-dir\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-systemd-units\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.297251 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-os-release\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-system-cni-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-etc-selinux\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-conf-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-systemd-units\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297372 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-system-cni-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.297377 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:46.797356361 +0000 UTC m=+3.036988722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-conf-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-var-lib-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.297550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-var-lib-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw644\" (UniqueName: \"kubernetes.io/projected/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-kube-api-access-fw644\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-daemon-config\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-etc-kubernetes\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-cni-netd\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-kubelet\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-slash\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-cnibin\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297721 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-kubelet\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-cni-netd\") 
pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-etc-kubernetes\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovnkube-script-lib\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-slash\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-k8s-cni-cncf-io\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f44ss\" (UniqueName: \"kubernetes.io/projected/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-kube-api-access-f44ss\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-cnibin\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.298418 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-os-release\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3bf65c2b-0944-4d58-bd8b-923617359ff3-tmp-dir\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-k8s-cni-cncf-io\") pod \"multus-sspfl\" 
(UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-os-release\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.297973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-kubelet\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-systemd\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovnkube-config\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-daemon-config\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3bf65c2b-0944-4d58-bd8b-923617359ff3-hosts-file\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3bf65c2b-0944-4d58-bd8b-923617359ff3-hosts-file\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-cni-binary-copy\") 
pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-node-log\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-device-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmgvw\" (UniqueName: \"kubernetes.io/projected/19ace946-23b0-451c-93fa-078938130dd5-kube-api-access-pmgvw\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-systemd\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovnkube-script-lib\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299269 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298242 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-cnibin\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-kubelet\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-cni-multus\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-device-dir\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-ovn\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-cni-bin\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-run-ovn-kubernetes\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovn-node-metrics-cert\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78gr4\" (UniqueName: \"kubernetes.io/projected/e7e79aca-b413-4120-897d-95784a08f56f-kube-api-access-78gr4\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5e2e434e-269f-4708-b72e-607842cf2bd9-cnibin\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: 
\"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-etc-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-log-socket\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-cni-binary-copy\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-socket-dir-parent\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-sys-fs\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.299934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovnkube-config\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e7e79aca-b413-4120-897d-95784a08f56f-sys-fs\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-cni-multus\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-socket-dir-parent\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-log-socket\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-run-ovn\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-var-lib-cni-bin\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-run-ovn-kubernetes\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.298994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-node-log\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6tc\" (UniqueName: \"kubernetes.io/projected/3bf65c2b-0944-4d58-bd8b-923617359ff3-kube-api-access-ms6tc\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-netns\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-cni-binary-copy\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-cni-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-multus-certs\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-netns\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-cni-binary-copy\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-env-overrides\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.300517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-hostroot\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-run-netns\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-multus-cni-dir\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299305 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-host-run-netns\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-hostroot\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-host-run-multus-certs\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299479 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-etc-openvswitch\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-env-overrides\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.301202 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.299587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5e2e434e-269f-4708-b72e-607842cf2bd9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.301617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.301595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-ovn-node-metrics-cert\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.308973 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.308932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f44ss\" (UniqueName: \"kubernetes.io/projected/366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f-kube-api-access-f44ss\") pod \"multus-sspfl\" (UID: \"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f\") " pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.310704 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.310680 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ktxm6\" (UniqueName: \"kubernetes.io/projected/5e2e434e-269f-4708-b72e-607842cf2bd9-kube-api-access-ktxm6\") pod \"multus-additional-cni-plugins-6x6wq\" (UID: \"5e2e434e-269f-4708-b72e-607842cf2bd9\") " pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.310894 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.310876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw644\" (UniqueName: \"kubernetes.io/projected/ea3f4bad-3513-4bfe-9cd3-e706b42dc86c-kube-api-access-fw644\") pod \"ovnkube-node-hwf7s\" (UID: \"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c\") " pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.311444 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.311426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmgvw\" (UniqueName: \"kubernetes.io/projected/19ace946-23b0-451c-93fa-078938130dd5-kube-api-access-pmgvw\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.311525 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.311511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gr4\" (UniqueName: \"kubernetes.io/projected/e7e79aca-b413-4120-897d-95784a08f56f-kube-api-access-78gr4\") pod \"aws-ebs-csi-driver-node-26nvf\" (UID: \"e7e79aca-b413-4120-897d-95784a08f56f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.312118 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.312097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6tc\" (UniqueName: \"kubernetes.io/projected/3bf65c2b-0944-4d58-bd8b-923617359ff3-kube-api-access-ms6tc\") pod \"node-resolver-5jr6w\" (UID: \"3bf65c2b-0944-4d58-bd8b-923617359ff3\") " pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.392228 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.392143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-985jk" Apr 22 18:42:46.398918 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.398897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" Apr 22 18:42:46.406515 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.406493 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h7ks7" Apr 22 18:42:46.413090 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.413066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kjkxq" Apr 22 18:42:46.418686 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.418668 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:42:46.425281 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.425263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" Apr 22 18:42:46.431800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.431767 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5jr6w" Apr 22 18:42:46.437340 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.437321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" Apr 22 18:42:46.442872 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.442856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sspfl" Apr 22 18:42:46.802307 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.802218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:46.802307 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:46.802269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:46.802524 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.802377 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:46.802524 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.802387 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:46.802524 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.802403 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:46.802524 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.802411 2575 projected.go:194] Error preparing data for projected volume kube-api-access-q8xjc for pod openshift-network-diagnostics/network-check-target-ql9wt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:46.802524 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.802448 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:47.802429355 +0000 UTC m=+4.042061700 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:46.802524 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:46.802468 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc podName:3ff2e0de-9129-4246-968b-183ec5c37452 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:47.802457407 +0000 UTC m=+4.042089757 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q8xjc" (UniqueName: "kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc") pod "network-check-target-ql9wt" (UID: "3ff2e0de-9129-4246-968b-183ec5c37452") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:46.892140 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.892112 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd060fb_c75b_48aa_888f_5176f41de266.slice/crio-d5e5fb35203c725f6838c5212f254bd0a1c98c437d44386f3abf4427cb5b95b7 WatchSource:0}: Error finding container d5e5fb35203c725f6838c5212f254bd0a1c98c437d44386f3abf4427cb5b95b7: Status 404 returned error can't find the container with id d5e5fb35203c725f6838c5212f254bd0a1c98c437d44386f3abf4427cb5b95b7 Apr 22 18:42:46.893032 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.892995 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3f4bad_3513_4bfe_9cd3_e706b42dc86c.slice/crio-59699b2c01b1705fb3587910fc24ea884b65be63ae1a826e71ea1bba466e1011 WatchSource:0}: Error finding container 59699b2c01b1705fb3587910fc24ea884b65be63ae1a826e71ea1bba466e1011: Status 404 returned error can't find the container with id 59699b2c01b1705fb3587910fc24ea884b65be63ae1a826e71ea1bba466e1011 Apr 22 18:42:46.893765 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.893745 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf65c2b_0944_4d58_bd8b_923617359ff3.slice/crio-59f3f3ac1231fffcd24e4e15fa6611ebf81983b2d29a7927b422a2ede3383b58 WatchSource:0}: Error finding container 59f3f3ac1231fffcd24e4e15fa6611ebf81983b2d29a7927b422a2ede3383b58: Status 404 returned error can't find the container with id 59f3f3ac1231fffcd24e4e15fa6611ebf81983b2d29a7927b422a2ede3383b58 Apr 22 18:42:46.894605 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.894574 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec13a05a_498a_4a5e_a065_7e57635aafff.slice/crio-addb5f804c729190a646af56e084c1f172b8359ec4a8bef76bc0a0eb549a6342 WatchSource:0}: Error finding container addb5f804c729190a646af56e084c1f172b8359ec4a8bef76bc0a0eb549a6342: Status 404 returned error can't find the container with id addb5f804c729190a646af56e084c1f172b8359ec4a8bef76bc0a0eb549a6342 Apr 22 18:42:46.897260 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.897167 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e79aca_b413_4120_897d_95784a08f56f.slice/crio-7fa11cc9fdf9d288423b0b21943bbefc02237e9654be4d42c34c54ae75368f9a WatchSource:0}: Error finding container 7fa11cc9fdf9d288423b0b21943bbefc02237e9654be4d42c34c54ae75368f9a: Status 404 returned error can't find the container with id 7fa11cc9fdf9d288423b0b21943bbefc02237e9654be4d42c34c54ae75368f9a Apr 22 18:42:46.899988 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.899729 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod366a36aa_b21c_49e8_8ed6_a85ab6ac5d4f.slice/crio-c2e59ecbdfbc5a29106dee67359bba4337093ef4c607349906c9047af4d2c831 WatchSource:0}: Error finding 
container c2e59ecbdfbc5a29106dee67359bba4337093ef4c607349906c9047af4d2c831: Status 404 returned error can't find the container with id c2e59ecbdfbc5a29106dee67359bba4337093ef4c607349906c9047af4d2c831 Apr 22 18:42:46.901225 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.901110 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047be1c6_96f6_47bd_80c1_539db0f3b59c.slice/crio-578530f9d05e4b2f04d31ff7727caaff6791a6a89fd374145700f82ed31bdab4 WatchSource:0}: Error finding container 578530f9d05e4b2f04d31ff7727caaff6791a6a89fd374145700f82ed31bdab4: Status 404 returned error can't find the container with id 578530f9d05e4b2f04d31ff7727caaff6791a6a89fd374145700f82ed31bdab4 Apr 22 18:42:46.902124 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.902086 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc6477a3_da8c_40f7_ae67_bf32ede541af.slice/crio-627d6f6a69f5702497602a5a2778ffe265de460442106f1126391621503431ff WatchSource:0}: Error finding container 627d6f6a69f5702497602a5a2778ffe265de460442106f1126391621503431ff: Status 404 returned error can't find the container with id 627d6f6a69f5702497602a5a2778ffe265de460442106f1126391621503431ff Apr 22 18:42:46.902894 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:42:46.902872 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2e434e_269f_4708_b72e_607842cf2bd9.slice/crio-23a8e05e91a1f76fffc431daf17d14de0c1275b17eabf939c3c86c3ec1aaddeb WatchSource:0}: Error finding container 23a8e05e91a1f76fffc431daf17d14de0c1275b17eabf939c3c86c3ec1aaddeb: Status 404 returned error can't find the container with id 23a8e05e91a1f76fffc431daf17d14de0c1275b17eabf939c3c86c3ec1aaddeb Apr 22 18:42:47.155095 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.155010 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:37:45 +0000 UTC" deadline="2028-01-07 09:05:35.43555004 +0000 UTC" Apr 22 18:42:47.155095 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.155041 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14990h22m48.280511879s" Apr 22 18:42:47.196954 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.196918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" event={"ID":"e7e79aca-b413-4120-897d-95784a08f56f","Type":"ContainerStarted","Data":"7fa11cc9fdf9d288423b0b21943bbefc02237e9654be4d42c34c54ae75368f9a"} Apr 22 18:42:47.197884 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.197852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-985jk" event={"ID":"ec13a05a-498a-4a5e-a065-7e57635aafff","Type":"ContainerStarted","Data":"addb5f804c729190a646af56e084c1f172b8359ec4a8bef76bc0a0eb549a6342"} Apr 22 18:42:47.199870 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.199847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" event={"ID":"150594baeab35cb17ccfa66548a34222","Type":"ContainerStarted","Data":"3c937aedad726f8f90e8c798d11a56ac14df6dec62a06fd005af73f480fe56b1"} Apr 22 18:42:47.201028 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.200997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerStarted","Data":"23a8e05e91a1f76fffc431daf17d14de0c1275b17eabf939c3c86c3ec1aaddeb"} Apr 22 18:42:47.202168 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.202145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h7ks7" event={"ID":"cc6477a3-da8c-40f7-ae67-bf32ede541af","Type":"ContainerStarted","Data":"627d6f6a69f5702497602a5a2778ffe265de460442106f1126391621503431ff"} Apr 22 18:42:47.203166 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.203144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5jr6w" event={"ID":"3bf65c2b-0944-4d58-bd8b-923617359ff3","Type":"ContainerStarted","Data":"59f3f3ac1231fffcd24e4e15fa6611ebf81983b2d29a7927b422a2ede3383b58"} Apr 22 18:42:47.204473 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.204451 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"59699b2c01b1705fb3587910fc24ea884b65be63ae1a826e71ea1bba466e1011"} Apr 22 18:42:47.205571 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.205547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kjkxq" event={"ID":"8cd060fb-c75b-48aa-888f-5176f41de266","Type":"ContainerStarted","Data":"d5e5fb35203c725f6838c5212f254bd0a1c98c437d44386f3abf4427cb5b95b7"} Apr 22 18:42:47.206814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.206790 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" event={"ID":"047be1c6-96f6-47bd-80c1-539db0f3b59c","Type":"ContainerStarted","Data":"578530f9d05e4b2f04d31ff7727caaff6791a6a89fd374145700f82ed31bdab4"} Apr 22 18:42:47.207834 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.207811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sspfl" event={"ID":"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f","Type":"ContainerStarted","Data":"c2e59ecbdfbc5a29106dee67359bba4337093ef4c607349906c9047af4d2c831"} Apr 22 18:42:47.220408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.220366 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" podStartSLOduration=2.220355861 podStartE2EDuration="2.220355861s" podCreationTimestamp="2026-04-22 18:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:47.220149315 +0000 UTC m=+3.459781674" watchObservedRunningTime="2026-04-22 18:42:47.220355861 +0000 UTC m=+3.459988219" Apr 22 18:42:47.235450 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.235429 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:47.809716 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.809631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:47.809716 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:47.809711 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:47.809948 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:47.809844 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:47.809948 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:47.809906 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.809887765 +0000 UTC m=+6.049520105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:47.810331 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:47.810311 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:47.810405 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:47.810337 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:47.810405 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:47.810350 2575 projected.go:194] Error preparing data for projected volume kube-api-access-q8xjc for pod openshift-network-diagnostics/network-check-target-ql9wt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:47.810405 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:47.810395 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc podName:3ff2e0de-9129-4246-968b-183ec5c37452 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:49.810379709 +0000 UTC m=+6.050012049 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8xjc" (UniqueName: "kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc") pod "network-check-target-ql9wt" (UID: "3ff2e0de-9129-4246-968b-183ec5c37452") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:48.191975 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:48.191899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:48.192409 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:48.192036 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:48.192481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:48.192413 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:48.192530 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:48.192491 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:42:48.230798 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:48.230107 2575 generic.go:358] "Generic (PLEG): container finished" podID="77d0c29eeab8147de5aeed09c8b86101" containerID="7187dcdd090e4a44f882553bb8cf151c769bf9f44e036a017a4e8012972319d5" exitCode=0 Apr 22 18:42:48.230798 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:48.230736 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" event={"ID":"77d0c29eeab8147de5aeed09c8b86101","Type":"ContainerDied","Data":"7187dcdd090e4a44f882553bb8cf151c769bf9f44e036a017a4e8012972319d5"} Apr 22 18:42:49.241905 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:49.241253 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" event={"ID":"77d0c29eeab8147de5aeed09c8b86101","Type":"ContainerStarted","Data":"0b8e246b3f7cef29d0fdd4f3de4244424d429640c5b2e381edd5b5d8359c58bf"} Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:49.828966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:49.829022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:49.829220 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:49.829265 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:49.829277 2575 projected.go:194] Error preparing data for projected volume kube-api-access-q8xjc for pod openshift-network-diagnostics/network-check-target-ql9wt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:49.829337 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc podName:3ff2e0de-9129-4246-968b-183ec5c37452 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:53.829318079 +0000 UTC m=+10.068950431 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8xjc" (UniqueName: "kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc") pod "network-check-target-ql9wt" (UID: "3ff2e0de-9129-4246-968b-183ec5c37452") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:49.829700 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:49.833209 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:49.829743 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:53.829729149 +0000 UTC m=+10.069361499 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:50.191027 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:50.189879 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:50.191027 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:50.190010 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:42:50.191027 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:50.190846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:50.191027 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:50.190943 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:52.190357 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:52.189784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:52.190357 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:52.189784 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:52.190357 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:52.190062 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:42:52.190357 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:52.190096 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:53.861252 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:53.861211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:53.861282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:53.861398 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:53.861409 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:53.861424 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:53.861436 2575 projected.go:194] Error preparing data for projected volume kube-api-access-q8xjc for pod openshift-network-diagnostics/network-check-target-ql9wt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:53.861470 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:01.861450764 +0000 UTC m=+18.101083114 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:53.861725 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:53.861493 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc podName:3ff2e0de-9129-4246-968b-183ec5c37452 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:01.861481271 +0000 UTC m=+18.101113610 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-q8xjc" (UniqueName: "kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc") pod "network-check-target-ql9wt" (UID: "3ff2e0de-9129-4246-968b-183ec5c37452") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:54.190718 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:54.190536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:54.190718 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:54.190666 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:54.190718 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:54.190706 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:54.191096 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:54.190789 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:42:56.189900 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:56.189643 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:56.190330 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:56.189748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:56.190330 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:56.190012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:42:56.190459 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:56.190431 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:58.190286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:58.190254 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:42:58.190719 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:42:58.190292 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:42:58.190719 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:58.190382 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:42:58.190719 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:42:58.190499 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:00.189865 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:00.189832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:00.189865 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:00.189856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:00.190363 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:00.189968 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:00.190363 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:00.190087 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:01.925667 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:01.925636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:01.925684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:01.925793 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:01.925811 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:01.925829 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:01.925844 2575 projected.go:194] Error preparing data for projected volume kube-api-access-q8xjc for pod openshift-network-diagnostics/network-check-target-ql9wt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:01.925852 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:17.925836113 +0000 UTC m=+34.165468454 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:01.926158 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:01.925893 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc podName:3ff2e0de-9129-4246-968b-183ec5c37452 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:17.925877717 +0000 UTC m=+34.165510056 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q8xjc" (UniqueName: "kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc") pod "network-check-target-ql9wt" (UID: "3ff2e0de-9129-4246-968b-183ec5c37452") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:02.189962 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:02.189877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:02.190120 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:02.190004 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:02.190243 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:02.189877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:02.190361 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:02.190340 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:04.190847 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.190820 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:04.191449 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:04.190919 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:04.191449 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.190943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:04.191449 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:04.191043 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:04.264791 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.264746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" event={"ID":"047be1c6-96f6-47bd-80c1-539db0f3b59c","Type":"ContainerStarted","Data":"a44ef9a2e3f223388a256b81b6cc6f5f8ebaa3b9b785aadb7c3c9be35716153d"} Apr 22 18:43:04.266135 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.266109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sspfl" event={"ID":"366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f","Type":"ContainerStarted","Data":"99b13a906626c85cd0007582789d877f51b7db58647b941a055d6ba841d6709f"} Apr 22 18:43:04.267484 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.267462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" event={"ID":"e7e79aca-b413-4120-897d-95784a08f56f","Type":"ContainerStarted","Data":"ff0559cde478e0a8d935cef956c1793f4e9ca9480df6d952a362cad9ccbf1c45"} Apr 22 18:43:04.268652 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.268629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-985jk" event={"ID":"ec13a05a-498a-4a5e-a065-7e57635aafff","Type":"ContainerStarted","Data":"c2b1cbccbef7e73dc44bf2678957b1dad6bc71e6d62fd2a2b81d412a787849d3"} Apr 22 18:43:04.270038 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.270010 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e2e434e-269f-4708-b72e-607842cf2bd9" containerID="8896865ba0c91fb1981747693e90e2eee4b4ed19ed086f7e3b0f6b83f3a59e5a" exitCode=0 Apr 22 18:43:04.270115 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.270068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerDied","Data":"8896865ba0c91fb1981747693e90e2eee4b4ed19ed086f7e3b0f6b83f3a59e5a"} Apr 22 18:43:04.271820 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.271496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h7ks7" event={"ID":"cc6477a3-da8c-40f7-ae67-bf32ede541af","Type":"ContainerStarted","Data":"0c99627d387c5d1781b7e123fb5f655705c03fcadc5cf8ac218eb29dcf889f40"} Apr 22 18:43:04.272935 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.272915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5jr6w" event={"ID":"3bf65c2b-0944-4d58-bd8b-923617359ff3","Type":"ContainerStarted","Data":"9b678ba79b3d312a7e9ba9c9be58f9a398766fb62fbf464c2e45c0a4bedd5c44"} Apr 22 18:43:04.275337 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275319 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:43:04.275608 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275589 2575 generic.go:358] "Generic (PLEG): container finished" podID="ea3f4bad-3513-4bfe-9cd3-e706b42dc86c" containerID="206dd2d4e141cc5b93c250dbdae6b86632670f8eb954de437d064bac4fdc57e4" exitCode=1 Apr 22 18:43:04.275669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" 
event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"494881c4579d16e6b9e5f6a9d3747d3887bda62f4fbc9bfa84fab0ed65e5b841"} Apr 22 18:43:04.275669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"c3621775c50de79ebef7f88733c31768c0a61e5c62fdabcfe6fabdc4d1dcb10d"} Apr 22 18:43:04.275669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"ba4e9b4e01e503475cc1833e929433198e1c07cb3456ff9461e8fb28ddc95485"} Apr 22 18:43:04.275669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerDied","Data":"206dd2d4e141cc5b93c250dbdae6b86632670f8eb954de437d064bac4fdc57e4"} Apr 22 18:43:04.275908 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.275676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"f857c3b1df51297a85c8a1f348321e26d26641dfd125846d0721050fd74858b1"} Apr 22 18:43:04.300267 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.300232 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-32.ec2.internal" podStartSLOduration=19.300220929 podStartE2EDuration="19.300220929s" podCreationTimestamp="2026-04-22 18:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:49.259416742 +0000 UTC m=+5.499049103" watchObservedRunningTime="2026-04-22 18:43:04.300220929 +0000 UTC m=+20.539853288" Apr 22 18:43:04.332128 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.332082 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6cfpf" podStartSLOduration=3.835612733 podStartE2EDuration="20.332069216s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.904285674 +0000 UTC m=+3.143918016" lastFinishedPulling="2026-04-22 18:43:03.400742155 +0000 UTC m=+19.640374499" observedRunningTime="2026-04-22 18:43:04.3006816 +0000 UTC m=+20.540313961" watchObservedRunningTime="2026-04-22 18:43:04.332069216 +0000 UTC m=+20.571701574" Apr 22 18:43:04.332500 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.332473 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5jr6w" podStartSLOduration=3.827906493 podStartE2EDuration="20.332468671s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.896180952 +0000 UTC m=+3.135813303" lastFinishedPulling="2026-04-22 18:43:03.400743143 +0000 UTC m=+19.640375481" observedRunningTime="2026-04-22 18:43:04.331979101 +0000 UTC m=+20.571611485" watchObservedRunningTime="2026-04-22 18:43:04.332468671 +0000 UTC m=+20.572101029" Apr 22 18:43:04.406027 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.405851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-sspfl" podStartSLOduration=3.862435504 podStartE2EDuration="20.405837911s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.90193369 +0000 UTC m=+3.141566037" lastFinishedPulling="2026-04-22 18:43:03.445336107 +0000 UTC m=+19.684968444" observedRunningTime="2026-04-22 18:43:04.370008326 +0000 UTC m=+20.609640687" watchObservedRunningTime="2026-04-22 18:43:04.405837911 +0000 UTC m=+20.645470269" Apr 22 18:43:04.430678 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.430588 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-985jk" podStartSLOduration=3.916727551 podStartE2EDuration="20.430572052s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.89643245 +0000 UTC m=+3.136064801" lastFinishedPulling="2026-04-22 18:43:03.410276955 +0000 UTC m=+19.649909302" observedRunningTime="2026-04-22 18:43:04.406449278 +0000 UTC m=+20.646081638" watchObservedRunningTime="2026-04-22 18:43:04.430572052 +0000 UTC m=+20.670204411" Apr 22 18:43:04.480429 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.480382 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h7ks7" podStartSLOduration=3.98387214 podStartE2EDuration="20.480370091s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.904264457 +0000 UTC m=+3.143896809" lastFinishedPulling="2026-04-22 18:43:03.40076241 +0000 UTC m=+19.640394760" observedRunningTime="2026-04-22 18:43:04.431150871 +0000 UTC m=+20.670783233" watchObservedRunningTime="2026-04-22 18:43:04.480370091 +0000 UTC m=+20.720002449" Apr 22 18:43:04.944687 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:04.944658 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:43:05.149634 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.149479 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:43:04.944682852Z","UUID":"34927d66-8de2-4dc0-bd39-c65bb2265da4","Handler":null,"Name":"","Endpoint":""} Apr 22 18:43:05.150907 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.150883 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:43:05.150994 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.150913 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:43:05.279675 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.279646 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:43:05.280245 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.280029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"c63e67d31d35b6865535a8752422b4b82d53c9313cd19d2bce856ffa74d1c360"} Apr 22 18:43:05.281280 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.281258 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kjkxq" event={"ID":"8cd060fb-c75b-48aa-888f-5176f41de266","Type":"ContainerStarted","Data":"9321382c35773dfdb0ead3d38fdd0e766dcdbef00986c8b024f691a708f9e366"} Apr 22 18:43:05.282832 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.282756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" event={"ID":"e7e79aca-b413-4120-897d-95784a08f56f","Type":"ContainerStarted","Data":"1236ad834f69946f928d82cce74aebd25d36b9366768ebbb5c5f45dc9d33ecc7"} Apr 22 18:43:05.303962 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:05.303921 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kjkxq" podStartSLOduration=4.781337637 podStartE2EDuration="21.303910313s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.893843327 +0000 UTC m=+3.133475678" lastFinishedPulling="2026-04-22 18:43:03.416416015 +0000 UTC m=+19.656048354" observedRunningTime="2026-04-22 18:43:05.303444806 +0000 UTC m=+21.543077175" watchObservedRunningTime="2026-04-22 18:43:05.303910313 +0000 UTC m=+21.543542673" Apr 22 18:43:06.189679 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:06.189657 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:06.189823 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:06.189661 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:06.189823 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:06.189784 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:06.189940 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:06.189834 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:07.084927 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.084887 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-985jk" Apr 22 18:43:07.085952 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.085932 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-985jk" Apr 22 18:43:07.288310 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.288281 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:43:07.288607 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.288574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"6770cf3934829245794d8a61cd0ce1e91b4063aca75b6d10199338d9508417ac"} Apr 22 18:43:07.290455 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.290428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" event={"ID":"e7e79aca-b413-4120-897d-95784a08f56f","Type":"ContainerStarted","Data":"9f7ac0e3201a96b727c333d811298dee1d8f4afc4bea02050055264bffd7d961"} Apr 22 18:43:07.290646 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.290628 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-985jk" Apr 22 18:43:07.291124 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.291108 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-985jk" Apr 22 18:43:07.309549 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:07.309498 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-26nvf" podStartSLOduration=4.03235057 podStartE2EDuration="23.309482665s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.900987015 +0000 UTC m=+3.140619355" lastFinishedPulling="2026-04-22 18:43:06.178119108 +0000 UTC m=+22.417751450" observedRunningTime="2026-04-22 18:43:07.308636457 +0000 UTC m=+23.548268839" watchObservedRunningTime="2026-04-22 18:43:07.309482665 +0000 UTC m=+23.549115024" Apr 22 18:43:08.189682 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:08.189646 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:08.189682 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:08.189667 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:08.190135 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:08.189784 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:08.190135 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:08.189894 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:10.190001 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.189807 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:10.190495 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.189853 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:10.190495 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:10.190085 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:10.190495 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:10.190187 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:10.297278 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.297245 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e2e434e-269f-4708-b72e-607842cf2bd9" containerID="9cab8a297c86c5cc925c6ddde073c168237ea2fcdbb604009720588d69cd7af3" exitCode=0 Apr 22 18:43:10.297409 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.297325 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerDied","Data":"9cab8a297c86c5cc925c6ddde073c168237ea2fcdbb604009720588d69cd7af3"} Apr 22 18:43:10.300330 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.300250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:43:10.300578 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.300558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"d6b4c1261103b11d45f808c90edbff140bd488367ef4680024a534f95e3b9471"} Apr 22 18:43:10.300982 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.300965 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:43:10.301097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.300990 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:43:10.301097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.301076 2575 scope.go:117] "RemoveContainer" containerID="206dd2d4e141cc5b93c250dbdae6b86632670f8eb954de437d064bac4fdc57e4" Apr 22 18:43:10.321087 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:10.321064 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:43:11.307962 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.307932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:43:11.308350 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.308273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" event={"ID":"ea3f4bad-3513-4bfe-9cd3-e706b42dc86c","Type":"ContainerStarted","Data":"c5ffd8a5f6e0cd0981df0696272d12f51f998fa8302c00dffe15cf6f8ee128a2"} Apr 22 18:43:11.308756 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.308734 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:43:11.324822 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.324612 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:43:11.348720 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.348676 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" podStartSLOduration=10.582682736 podStartE2EDuration="27.348662232s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.896106558 +0000 UTC m=+3.135738910" 
lastFinishedPulling="2026-04-22 18:43:03.662086055 +0000 UTC m=+19.901718406" observedRunningTime="2026-04-22 18:43:11.34847678 +0000 UTC m=+27.588109140" watchObservedRunningTime="2026-04-22 18:43:11.348662232 +0000 UTC m=+27.588294605" Apr 22 18:43:11.748842 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.748803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ql9wt"] Apr 22 18:43:11.749035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.748929 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:11.749123 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:11.749085 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:11.749486 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.749458 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7zmbr"] Apr 22 18:43:11.749605 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:11.749574 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:11.749694 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:11.749673 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:12.311829 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:12.311797 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e2e434e-269f-4708-b72e-607842cf2bd9" containerID="c265f632aa59128df5e4898bb20952aac9da29862315604253b9f335bc19125e" exitCode=0 Apr 22 18:43:12.312491 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:12.311876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerDied","Data":"c265f632aa59128df5e4898bb20952aac9da29862315604253b9f335bc19125e"} Apr 22 18:43:13.189577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:13.189551 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:13.189718 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:13.189578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:13.189718 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:13.189651 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:13.189851 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:13.189748 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:14.316862 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:14.316740 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e2e434e-269f-4708-b72e-607842cf2bd9" containerID="38d9bd77bda231676af1a7de44f87474d8ebff8448401364dab5de99b87e9c67" exitCode=0 Apr 22 18:43:14.316862 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:14.316823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerDied","Data":"38d9bd77bda231676af1a7de44f87474d8ebff8448401364dab5de99b87e9c67"} Apr 22 18:43:15.189534 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:15.189502 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:15.189706 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:15.189516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:15.189706 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:15.189631 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ql9wt" podUID="3ff2e0de-9129-4246-968b-183ec5c37452" Apr 22 18:43:15.189859 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:15.189792 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:43:16.634580 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.634506 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-32.ec2.internal" event="NodeReady" Apr 22 18:43:16.634997 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.634659 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:43:16.683547 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.683514 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sn54r"] Apr 22 18:43:16.702283 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.702255 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-grs9r"] Apr 22 18:43:16.702421 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.702404 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:16.704763 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.704641 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:43:16.704763 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.704669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:43:16.704763 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.704670 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrbrj\"" Apr 22 18:43:16.707519 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.705876 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:43:16.720371 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.720350 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sn54r"] Apr 22 18:43:16.720371 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.720375 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-grs9r"] Apr 22 18:43:16.720520 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.720464 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.722784 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.722746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:43:16.722875 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.722827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:43:16.722995 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.722828 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5kx7g\"" Apr 22 18:43:16.841053 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.841017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.841053 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.841058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dzlf\" (UniqueName: \"kubernetes.io/projected/dcba4051-c58c-4ba8-baba-853741840882-kube-api-access-5dzlf\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:16.841286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.841086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28j8f\" (UniqueName: \"kubernetes.io/projected/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-kube-api-access-28j8f\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.841286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.841109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:16.841286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.841140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-config-volume\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.841286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.841214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-tmp-dir\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.942325 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-config-volume\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.942325 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-tmp-dir\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.942534 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.942534 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dzlf\" (UniqueName: \"kubernetes.io/projected/dcba4051-c58c-4ba8-baba-853741840882-kube-api-access-5dzlf\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:16.942534 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28j8f\" (UniqueName: \"kubernetes.io/projected/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-kube-api-access-28j8f\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.942534 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:16.942534 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:16.942493 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" 
not found Apr 22 18:43:16.942534 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:16.942525 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:16.942808 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:16.942574 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:43:17.442552907 +0000 UTC m=+33.682185248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:16.942808 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:16.942594 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:17.442584588 +0000 UTC m=+33.682216937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:16.942808 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-tmp-dir\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.943010 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.942989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-config-volume\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.953153 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.953011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28j8f\" (UniqueName: \"kubernetes.io/projected/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-kube-api-access-28j8f\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:16.953288 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:16.953034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dzlf\" (UniqueName: \"kubernetes.io/projected/dcba4051-c58c-4ba8-baba-853741840882-kube-api-access-5dzlf\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:17.189916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.189886 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:17.190090 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.189929 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:17.195107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.195019 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:43:17.195107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.195071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7gnxr\"" Apr 22 18:43:17.195107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.195092 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:43:17.195318 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.195027 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:43:17.195318 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.195027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xrqzc\"" Apr 22 18:43:17.446170 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.446085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:17.446170 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.446140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:17.446387 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:17.446196 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:17.446387 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:17.446262 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:43:18.446242225 +0000 UTC m=+34.685874571 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:17.446387 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:17.446267 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:17.446387 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:17.446328 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:18.446310233 +0000 UTC m=+34.685942577 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:17.950566 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.950516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:17.951072 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.950578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:17.951072 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:17.950674 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:43:17.951072 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:17.950756 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:49.950733723 +0000 UTC m=+66.190366079 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : secret "metrics-daemon-secret" not found Apr 22 18:43:17.954209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:17.954185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8xjc\" (UniqueName: \"kubernetes.io/projected/3ff2e0de-9129-4246-968b-183ec5c37452-kube-api-access-q8xjc\") pod \"network-check-target-ql9wt\" (UID: \"3ff2e0de-9129-4246-968b-183ec5c37452\") " pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:18.107184 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:18.107148 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:18.454004 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:18.453968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:18.454286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:18.454071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:18.454286 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:18.454132 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:18.454286 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:18.454169 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:18.454286 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:18.454211 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:20.454188951 +0000 UTC m=+36.693821304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:18.454286 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:18.454229 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:43:20.454223328 +0000 UTC m=+36.693855668 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:20.070118 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:20.070086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ql9wt"] Apr 22 18:43:20.074732 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:43:20.074696 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff2e0de_9129_4246_968b_183ec5c37452.slice/crio-28a92573814b14000af1fefb71f1589bd5a99547406371822b29193dd3f0b5f0 WatchSource:0}: Error finding container 28a92573814b14000af1fefb71f1589bd5a99547406371822b29193dd3f0b5f0: Status 404 returned error can't find the container with id 28a92573814b14000af1fefb71f1589bd5a99547406371822b29193dd3f0b5f0 Apr 22 18:43:20.330155 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:20.330114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ql9wt" event={"ID":"3ff2e0de-9129-4246-968b-183ec5c37452","Type":"ContainerStarted","Data":"28a92573814b14000af1fefb71f1589bd5a99547406371822b29193dd3f0b5f0"} Apr 22 18:43:20.467412 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:20.467221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:20.467572 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:20.467452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:20.467572 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:20.467362 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:20.467572 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:20.467520 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:24.467504265 +0000 UTC m=+40.707136605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:20.467572 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:20.467536 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:20.467815 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:20.467586 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:43:24.467570237 +0000 UTC m=+40.707202575 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:21.334600 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:21.334517 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e2e434e-269f-4708-b72e-607842cf2bd9" containerID="33afb91edd9e1a9d8cbd45ed311cdef32971a16e85305623a8d4fd981c821b10" exitCode=0 Apr 22 18:43:21.334993 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:21.334617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerDied","Data":"33afb91edd9e1a9d8cbd45ed311cdef32971a16e85305623a8d4fd981c821b10"} Apr 22 18:43:22.339677 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:22.339637 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e2e434e-269f-4708-b72e-607842cf2bd9" containerID="6d5e40e5ad7450385ef70454315698f72420ee15df8f793e9c078f78e5008bca" exitCode=0 Apr 22 18:43:22.340145 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:22.339709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerDied","Data":"6d5e40e5ad7450385ef70454315698f72420ee15df8f793e9c078f78e5008bca"} Apr 22 18:43:23.344702 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:23.344665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" event={"ID":"5e2e434e-269f-4708-b72e-607842cf2bd9","Type":"ContainerStarted","Data":"68ff07b7e0f1414e0efefb695a698c5b4413ef2d027e1e6dd27552a55e36d11e"} Apr 22 18:43:23.345814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:23.345792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ql9wt" event={"ID":"3ff2e0de-9129-4246-968b-183ec5c37452","Type":"ContainerStarted","Data":"39fc028cac1865de52e49d7af13595a59e7b817d5b48831e396256ba1d47fdae"} Apr 22 18:43:23.345949 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:23.345903 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:43:23.367880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:23.367829 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6x6wq" podStartSLOduration=6.000952821 podStartE2EDuration="39.367817622s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:46.90443632 +0000 UTC m=+3.144068656" lastFinishedPulling="2026-04-22 18:43:20.271301116 +0000 UTC m=+36.510933457" observedRunningTime="2026-04-22 18:43:23.367186996 +0000 UTC m=+39.606819351" watchObservedRunningTime="2026-04-22 18:43:23.367817622 +0000 UTC m=+39.607450026" Apr 22 18:43:23.382491 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:23.382451 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ql9wt" podStartSLOduration=36.56816249 podStartE2EDuration="39.382436831s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:43:20.077020479 +0000 UTC m=+36.316652830" lastFinishedPulling="2026-04-22 18:43:22.89129483 +0000 UTC 
m=+39.130927171" observedRunningTime="2026-04-22 18:43:23.381691291 +0000 UTC m=+39.621323651" watchObservedRunningTime="2026-04-22 18:43:23.382436831 +0000 UTC m=+39.622069190" Apr 22 18:43:24.494762 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:24.494724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:24.495206 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:24.494877 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:24.495206 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:24.494947 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:32.494929032 +0000 UTC m=+48.734561391 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:24.495206 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:24.494999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:24.495206 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:24.495084 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:24.495206 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:24.495148 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:43:32.49512978 +0000 UTC m=+48.734762120 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:32.543737 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:32.543700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:32.543737 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:32.543740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:32.544157 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:32.543849 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:32.544157 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:32.543851 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:32.544157 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:32.543899 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:48.543885382 +0000 UTC m=+64.783517719 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:32.544157 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:32.543911 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:43:48.543905301 +0000 UTC m=+64.783537639 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:43.323154 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:43.323126 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hwf7s" Apr 22 18:43:48.549515 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:48.549479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:43:48.549515 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:48.549522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:43:48.549943 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:48.549616 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:48.549943 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:48.549618 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:48.549943 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:48.549679 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:20.549664833 +0000 UTC m=+96.789297170 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:43:48.549943 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:48.549693 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:20.549686699 +0000 UTC m=+96.789319036 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:43:49.960935 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:49.960892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:43:49.961312 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:49.961001 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:43:49.961312 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:43:49.961051 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:53.96103844 +0000 UTC m=+130.200670777 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : secret "metrics-daemon-secret" not found Apr 22 18:43:54.349915 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:43:54.349805 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ql9wt" Apr 22 18:44:20.571445 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:20.571400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:44:20.572045 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:20.571465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:44:20.572045 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:20.571539 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:44:20.572045 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:20.571551 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:44:20.572045 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:20.571607 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls podName:6922ad30-ba0a-4bf8-b384-cdf6a0514c3a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:24.571593723 +0000 UTC m=+160.811226059 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls") pod "dns-default-grs9r" (UID: "6922ad30-ba0a-4bf8-b384-cdf6a0514c3a") : secret "dns-default-metrics-tls" not found Apr 22 18:44:20.572045 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:20.571622 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert podName:dcba4051-c58c-4ba8-baba-853741840882 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:24.57161603 +0000 UTC m=+160.811248368 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert") pod "ingress-canary-sn54r" (UID: "dcba4051-c58c-4ba8-baba-853741840882") : secret "canary-serving-cert" not found Apr 22 18:44:42.640289 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.640250 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x"] Apr 22 18:44:42.641957 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.641936 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" Apr 22 18:44:42.644199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.644177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:44:42.644302 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.644230 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:44:42.645158 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.645143 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-z8q6v\"" Apr 22 18:44:42.651874 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.651854 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x"] Apr 22 18:44:42.718083 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.718048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw7gc\" (UniqueName: \"kubernetes.io/projected/119686fb-62fb-4304-be1e-ea9e264fd21d-kube-api-access-pw7gc\") pod \"volume-data-source-validator-7c6cbb6c87-shk2x\" (UID: \"119686fb-62fb-4304-be1e-ea9e264fd21d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" Apr 22 18:44:42.818390 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.818355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw7gc\" (UniqueName: \"kubernetes.io/projected/119686fb-62fb-4304-be1e-ea9e264fd21d-kube-api-access-pw7gc\") pod \"volume-data-source-validator-7c6cbb6c87-shk2x\" (UID: \"119686fb-62fb-4304-be1e-ea9e264fd21d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" Apr 22 18:44:42.828066 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.828043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw7gc\" (UniqueName: \"kubernetes.io/projected/119686fb-62fb-4304-be1e-ea9e264fd21d-kube-api-access-pw7gc\") pod 
\"volume-data-source-validator-7c6cbb6c87-shk2x\" (UID: \"119686fb-62fb-4304-be1e-ea9e264fd21d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" Apr 22 18:44:42.848869 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.848842 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl"] Apr 22 18:44:42.850333 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.850319 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7cc989c66-cc7nk"] Apr 22 18:44:42.850476 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.850457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:42.854366 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.854252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:44:42.854921 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.854903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:42.855155 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.855133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fdghj\"" Apr 22 18:44:42.855341 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.855318 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:44:42.855515 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.855504 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:44:42.856090 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.856069 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 18:44:42.857014 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.856996 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 18:44:42.857161 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.857148 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 18:44:42.857287 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.857189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jrz95\"" Apr 22 18:44:42.857394 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.857377 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:44:42.857800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.857761 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 18:44:42.857800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.857796 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:44:42.858057 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.858040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 18:44:42.861353 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.861331 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl"] Apr 22 18:44:42.863906 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.863879 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7cc989c66-cc7nk"] Apr 22 18:44:42.919434 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919365 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-stats-auth\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:42.919434 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:42.919434 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919421 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-default-certificate\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:42.919617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:42.919617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:42.919617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gj4\" (UniqueName: \"kubernetes.io/projected/309b4d32-03cc-43ef-b0f0-8f772378a81a-kube-api-access-f7gj4\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:42.919617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmpr\" (UniqueName: \"kubernetes.io/projected/d296e50b-a805-4e1b-9297-f74fb4549ed5-kube-api-access-rsmpr\") pod \"router-default-7cc989c66-cc7nk\" (UID: 
\"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:42.919617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.919569 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/309b4d32-03cc-43ef-b0f0-8f772378a81a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:42.950355 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:42.950330 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-stats-auth\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-default-certificate\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7gj4\" (UniqueName: \"kubernetes.io/projected/309b4d32-03cc-43ef-b0f0-8f772378a81a-kube-api-access-f7gj4\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020500 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rsmpr\" (UniqueName: \"kubernetes.io/projected/d296e50b-a805-4e1b-9297-f74fb4549ed5-kube-api-access-rsmpr\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.020540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/309b4d32-03cc-43ef-b0f0-8f772378a81a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:43.021955 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.021587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/309b4d32-03cc-43ef-b0f0-8f772378a81a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:43.023285 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.023265 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:43.523244781 +0000 UTC m=+119.762877126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : configmap references non-existent config key: service-ca.crt Apr 22 18:44:43.023761 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.023743 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:44:43.023942 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.023930 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:43.523913782 +0000 UTC m=+119.763546133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : secret "router-metrics-certs-default" not found Apr 22 18:44:43.024604 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.024588 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:43.024754 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.024744 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls podName:309b4d32-03cc-43ef-b0f0-8f772378a81a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:43.52472903 +0000 UTC m=+119.764361372 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p7ksl" (UID: "309b4d32-03cc-43ef-b0f0-8f772378a81a") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:43.032382 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.032334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-stats-auth\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.032382 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.032348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-default-certificate\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.033854 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.033808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmpr\" (UniqueName: \"kubernetes.io/projected/d296e50b-a805-4e1b-9297-f74fb4549ed5-kube-api-access-rsmpr\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.035341 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.035316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gj4\" (UniqueName: \"kubernetes.io/projected/309b4d32-03cc-43ef-b0f0-8f772378a81a-kube-api-access-f7gj4\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:43.079789 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.079746 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x"] Apr 22 18:44:43.083676 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:44:43.083647 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119686fb_62fb_4304_be1e_ea9e264fd21d.slice/crio-e1cb00f22a5d9d37d366544f7111aaa8d15bd7f7e59a67e9f9534f61e4917b4b WatchSource:0}: Error finding container e1cb00f22a5d9d37d366544f7111aaa8d15bd7f7e59a67e9f9534f61e4917b4b: Status 404 returned error can't find the container with id e1cb00f22a5d9d37d366544f7111aaa8d15bd7f7e59a67e9f9534f61e4917b4b Apr 22 18:44:43.489410 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.489376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" event={"ID":"119686fb-62fb-4304-be1e-ea9e264fd21d","Type":"ContainerStarted","Data":"e1cb00f22a5d9d37d366544f7111aaa8d15bd7f7e59a67e9f9534f61e4917b4b"} Apr 22 18:44:43.524873 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.524844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " 
pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.524974 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.524878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:43.524974 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:43.524897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:43.525062 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.524986 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:43.525062 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.524987 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:44:43.525062 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.525035 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:44.525018041 +0000 UTC m=+120.764650378 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : configmap references non-existent config key: service-ca.crt Apr 22 18:44:43.525062 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.525055 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:44.525047939 +0000 UTC m=+120.764680277 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : secret "router-metrics-certs-default" not found Apr 22 18:44:43.525193 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:43.525065 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls podName:309b4d32-03cc-43ef-b0f0-8f772378a81a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:44.525060055 +0000 UTC m=+120.764692391 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p7ksl" (UID: "309b4d32-03cc-43ef-b0f0-8f772378a81a") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:44.492634 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:44.492593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" event={"ID":"119686fb-62fb-4304-be1e-ea9e264fd21d","Type":"ContainerStarted","Data":"9acfaef623e041df4142838d4ecd57f26a3fb5f55373307515220076cf3d561a"} Apr 22 18:44:44.511995 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:44.511947 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-shk2x" podStartSLOduration=1.167747492 podStartE2EDuration="2.511933186s" podCreationTimestamp="2026-04-22 18:44:42 +0000 UTC" firstStartedPulling="2026-04-22 18:44:43.085307464 +0000 UTC m=+119.324939801" lastFinishedPulling="2026-04-22 18:44:44.429493153 +0000 UTC m=+120.669125495" observedRunningTime="2026-04-22 18:44:44.510983731 +0000 UTC m=+120.750616091" watchObservedRunningTime="2026-04-22 18:44:44.511933186 +0000 UTC m=+120.751565545" Apr 22 18:44:44.531553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:44.531523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:44.531672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:44.531558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:44.531672 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:44.531581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:44.531750 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:44.531671 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:44.531750 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:44.531714 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls podName:309b4d32-03cc-43ef-b0f0-8f772378a81a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:46.531702137 +0000 UTC m=+122.771334473 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p7ksl" (UID: "309b4d32-03cc-43ef-b0f0-8f772378a81a") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:44.531750 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:44.531671 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:44:44.531874 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:44.531794 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:46.531763253 +0000 UTC m=+122.771395605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : secret "router-metrics-certs-default" not found Apr 22 18:44:44.531874 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:44.531813 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:46.531800002 +0000 UTC m=+122.771432340 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : configmap references non-existent config key: service-ca.crt Apr 22 18:44:46.543893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:46.543857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:46.543893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:46.543894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:46.544306 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:46.543962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:46.544306 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:46.544019 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:44:50.543997171 +0000 UTC m=+126.783629520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : configmap references non-existent config key: service-ca.crt Apr 22 18:44:46.544306 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:46.544043 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:44:46.544306 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:46.544055 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:46.544306 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:46.544081 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:50.544069238 +0000 UTC m=+126.783701575 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : secret "router-metrics-certs-default" not found Apr 22 18:44:46.544306 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:46.544096 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls podName:309b4d32-03cc-43ef-b0f0-8f772378a81a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:50.544084831 +0000 UTC m=+126.783717182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p7ksl" (UID: "309b4d32-03cc-43ef-b0f0-8f772378a81a") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:48.119650 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.119618 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5jr6w_3bf65c2b-0944-4d58-bd8b-923617359ff3/dns-node-resolver/0.log" Apr 22 18:44:48.689880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.689852 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2r8wk"] Apr 22 18:44:48.691500 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.691485 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.693996 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.693971 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 18:44:48.694111 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.694044 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:44:48.694111 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.693976 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-fgtsc\"" Apr 22 18:44:48.694928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.694908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 18:44:48.695029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.694918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 18:44:48.701258 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.701232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2r8wk"] Apr 22 18:44:48.703017 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.702995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:44:48.761932 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.761906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03550605-e0bb-4434-8e90-08b3aecc5a4c-config\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.762053 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.761938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03550605-e0bb-4434-8e90-08b3aecc5a4c-serving-cert\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.762053 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.761957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03550605-e0bb-4434-8e90-08b3aecc5a4c-trusted-ca\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.762053 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.762035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56r2\" (UniqueName: \"kubernetes.io/projected/03550605-e0bb-4434-8e90-08b3aecc5a4c-kube-api-access-w56r2\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.862950 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.862920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/03550605-e0bb-4434-8e90-08b3aecc5a4c-config\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.863029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.862955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03550605-e0bb-4434-8e90-08b3aecc5a4c-serving-cert\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.863029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.862972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03550605-e0bb-4434-8e90-08b3aecc5a4c-trusted-ca\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.863029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.863012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w56r2\" (UniqueName: \"kubernetes.io/projected/03550605-e0bb-4434-8e90-08b3aecc5a4c-kube-api-access-w56r2\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.863498 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.863469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03550605-e0bb-4434-8e90-08b3aecc5a4c-config\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.863682 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.863664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03550605-e0bb-4434-8e90-08b3aecc5a4c-trusted-ca\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.865206 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.865187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03550605-e0bb-4434-8e90-08b3aecc5a4c-serving-cert\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:48.871328 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:48.871305 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56r2\" (UniqueName: \"kubernetes.io/projected/03550605-e0bb-4434-8e90-08b3aecc5a4c-kube-api-access-w56r2\") pod \"console-operator-9d4b6777b-2r8wk\" (UID: \"03550605-e0bb-4434-8e90-08b3aecc5a4c\") " pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:49.005499 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:49.005475 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:49.111961 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:49.111931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2r8wk"] Apr 22 18:44:49.114451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:49.114432 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h7ks7_cc6477a3-da8c-40f7-ae67-bf32ede541af/node-ca/0.log" Apr 22 18:44:49.114821 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:44:49.114795 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03550605_e0bb_4434_8e90_08b3aecc5a4c.slice/crio-352027a6fbaa106bd0295b95fdc12b93e072b7b241c6d007304076ce0c9d6f03 WatchSource:0}: Error finding container 352027a6fbaa106bd0295b95fdc12b93e072b7b241c6d007304076ce0c9d6f03: Status 404 returned error can't find the container with id 352027a6fbaa106bd0295b95fdc12b93e072b7b241c6d007304076ce0c9d6f03 Apr 22 18:44:49.503372 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:49.503332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" event={"ID":"03550605-e0bb-4434-8e90-08b3aecc5a4c","Type":"ContainerStarted","Data":"352027a6fbaa106bd0295b95fdc12b93e072b7b241c6d007304076ce0c9d6f03"} Apr 22 18:44:50.575472 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:50.575440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:50.575488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:50.575585 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:50.575606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:50.575638 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:58.575623996 +0000 UTC m=+134.815256338 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : secret "router-metrics-certs-default" not found Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:50.575657 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:58.575651115 +0000 UTC m=+134.815283451 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : configmap references non-existent config key: service-ca.crt Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:50.575676 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:50.576005 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:50.575710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls podName:309b4d32-03cc-43ef-b0f0-8f772378a81a nodeName:}" failed. No retries permitted until 2026-04-22 18:44:58.575699853 +0000 UTC m=+134.815332191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p7ksl" (UID: "309b4d32-03cc-43ef-b0f0-8f772378a81a") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:51.507555 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:51.507530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" event={"ID":"03550605-e0bb-4434-8e90-08b3aecc5a4c","Type":"ContainerStarted","Data":"290850c0d4a88ad4367f3a88d553a2335f0fce988472342140553ef95fb45a42"} Apr 22 18:44:51.507735 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:51.507718 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:51.509043 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:51.509018 2575 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-2r8wk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" start-of-body= Apr 22 18:44:51.509118 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:51.509060 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" Apr 22 18:44:51.523424 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:51.523200 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" 
podStartSLOduration=1.20353238 podStartE2EDuration="3.523188635s" podCreationTimestamp="2026-04-22 18:44:48 +0000 UTC" firstStartedPulling="2026-04-22 18:44:49.116599908 +0000 UTC m=+125.356232245" lastFinishedPulling="2026-04-22 18:44:51.436256163 +0000 UTC m=+127.675888500" observedRunningTime="2026-04-22 18:44:51.52270251 +0000 UTC m=+127.762334868" watchObservedRunningTime="2026-04-22 18:44:51.523188635 +0000 UTC m=+127.762820994" Apr 22 18:44:52.510415 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.510388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/0.log" Apr 22 18:44:52.510913 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.510425 2575 generic.go:358] "Generic (PLEG): container finished" podID="03550605-e0bb-4434-8e90-08b3aecc5a4c" containerID="290850c0d4a88ad4367f3a88d553a2335f0fce988472342140553ef95fb45a42" exitCode=255 Apr 22 18:44:52.510913 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.510449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" event={"ID":"03550605-e0bb-4434-8e90-08b3aecc5a4c","Type":"ContainerDied","Data":"290850c0d4a88ad4367f3a88d553a2335f0fce988472342140553ef95fb45a42"} Apr 22 18:44:52.510913 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.510768 2575 scope.go:117] "RemoveContainer" containerID="290850c0d4a88ad4367f3a88d553a2335f0fce988472342140553ef95fb45a42" Apr 22 18:44:52.795648 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.795574 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq"] Apr 22 18:44:52.797235 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.797220 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.799718 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.799694 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:44:52.799871 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.799802 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5lkfh\"" Apr 22 18:44:52.799982 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.799965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:44:52.800825 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.800799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:44:52.800825 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.800799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:44:52.811029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.811006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq"] Apr 22 18:44:52.896400 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.896358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.896570 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.896497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.896570 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.896532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfcx\" (UniqueName: \"kubernetes.io/projected/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-kube-api-access-lbfcx\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.997403 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.997355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.997403 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.997406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfcx\" (UniqueName: \"kubernetes.io/projected/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-kube-api-access-lbfcx\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.997619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.997450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.997920 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.997897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:52.999628 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:52.999613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:53.005695 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.005675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfcx\" (UniqueName: \"kubernetes.io/projected/3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb-kube-api-access-lbfcx\") pod \"kube-storage-version-migrator-operator-6769c5d45-x4rmq\" (UID: \"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:53.105752 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.105660 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" Apr 22 18:44:53.220682 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.220651 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq"] Apr 22 18:44:53.223953 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:44:53.223928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3efbd4c5_3c68_4fb4_8a66_b5731e17e5fb.slice/crio-1e836f59512339d9c2c3b15475d08267bf6a32a4e45f41745f415bb76d467ec8 WatchSource:0}: Error finding container 1e836f59512339d9c2c3b15475d08267bf6a32a4e45f41745f415bb76d467ec8: Status 404 returned error can't find the container with id 1e836f59512339d9c2c3b15475d08267bf6a32a4e45f41745f415bb76d467ec8 Apr 22 18:44:53.513384 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.513363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/1.log" Apr 22 18:44:53.513785 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.513728 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/0.log" Apr 22 18:44:53.513828 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.513766 2575 generic.go:358] "Generic (PLEG): container finished" podID="03550605-e0bb-4434-8e90-08b3aecc5a4c" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" exitCode=255 Apr 22 18:44:53.513864 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.513840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" event={"ID":"03550605-e0bb-4434-8e90-08b3aecc5a4c","Type":"ContainerDied","Data":"23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e"} Apr 22 18:44:53.513900 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.513870 2575 scope.go:117] "RemoveContainer" containerID="290850c0d4a88ad4367f3a88d553a2335f0fce988472342140553ef95fb45a42" Apr 22 18:44:53.514138 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.514120 2575 scope.go:117] "RemoveContainer" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" Apr 22 18:44:53.514332 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:53.514306 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:44:53.515079 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:53.514863 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" event={"ID":"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb","Type":"ContainerStarted","Data":"1e836f59512339d9c2c3b15475d08267bf6a32a4e45f41745f415bb76d467ec8"} Apr 22 18:44:54.004952 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.004923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:44:54.005107 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:54.005070 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:44:54.005159 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:54.005135 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs podName:19ace946-23b0-451c-93fa-078938130dd5 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.005120486 +0000 UTC m=+252.244752822 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs") pod "network-metrics-daemon-7zmbr" (UID: "19ace946-23b0-451c-93fa-078938130dd5") : secret "metrics-daemon-secret" not found Apr 22 18:44:54.401886 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.401817 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n"] Apr 22 18:44:54.403316 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.403302 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" Apr 22 18:44:54.405581 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.405558 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-svdv2\"" Apr 22 18:44:54.412884 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.412863 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n"] Apr 22 18:44:54.507964 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.507929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77xdp\" (UniqueName: \"kubernetes.io/projected/9446380b-9352-4cac-900c-67cae9df5d6d-kube-api-access-77xdp\") pod \"network-check-source-8894fc9bd-t9c6n\" (UID: \"9446380b-9352-4cac-900c-67cae9df5d6d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" Apr 22 18:44:54.518819 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.518791 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/1.log" Apr 22 18:44:54.519191 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.519144 2575 scope.go:117] "RemoveContainer" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" Apr 22 18:44:54.519345 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:54.519326 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:44:54.609363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.609288 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-77xdp\" (UniqueName: \"kubernetes.io/projected/9446380b-9352-4cac-900c-67cae9df5d6d-kube-api-access-77xdp\") pod \"network-check-source-8894fc9bd-t9c6n\" (UID: \"9446380b-9352-4cac-900c-67cae9df5d6d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" Apr 22 18:44:54.618486 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.618462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77xdp\" (UniqueName: \"kubernetes.io/projected/9446380b-9352-4cac-900c-67cae9df5d6d-kube-api-access-77xdp\") pod \"network-check-source-8894fc9bd-t9c6n\" (UID: \"9446380b-9352-4cac-900c-67cae9df5d6d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" Apr 22 18:44:54.713616 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.713582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" Apr 22 18:44:54.831566 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:54.831536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n"] Apr 22 18:44:54.835026 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:44:54.834982 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9446380b_9352_4cac_900c_67cae9df5d6d.slice/crio-ee294fcdfd11a76516dcf11651076702c0a70df71addf7a08338c827cbd714d1 WatchSource:0}: Error finding container ee294fcdfd11a76516dcf11651076702c0a70df71addf7a08338c827cbd714d1: Status 404 returned error can't find the container with id ee294fcdfd11a76516dcf11651076702c0a70df71addf7a08338c827cbd714d1 Apr 22 18:44:55.522950 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:55.522907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" event={"ID":"9446380b-9352-4cac-900c-67cae9df5d6d","Type":"ContainerStarted","Data":"9d2998c298692bbae73dcb8cc511b66364bcdce37a03f08fc0e718c1089a3fb8"} Apr 22 18:44:55.523397 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:55.522956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" event={"ID":"9446380b-9352-4cac-900c-67cae9df5d6d","Type":"ContainerStarted","Data":"ee294fcdfd11a76516dcf11651076702c0a70df71addf7a08338c827cbd714d1"} Apr 22 18:44:55.552922 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:55.552865 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9c6n" podStartSLOduration=1.5528458729999999 podStartE2EDuration="1.552845873s" podCreationTimestamp="2026-04-22 18:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:44:55.552096728 +0000 UTC m=+131.791729108" watchObservedRunningTime="2026-04-22 18:44:55.552845873 +0000 UTC m=+131.792478232" Apr 22 18:44:56.527494 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:56.527455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" event={"ID":"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb","Type":"ContainerStarted","Data":"79fe45d00e5f51a14c7b489b8a5867deb757ffb11e6a215b782e065609c47b05"} Apr 22 18:44:56.544261 
ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:56.544218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" podStartSLOduration=1.86274223 podStartE2EDuration="4.544205244s" podCreationTimestamp="2026-04-22 18:44:52 +0000 UTC" firstStartedPulling="2026-04-22 18:44:53.225860336 +0000 UTC m=+129.465492673" lastFinishedPulling="2026-04-22 18:44:55.90732335 +0000 UTC m=+132.146955687" observedRunningTime="2026-04-22 18:44:56.543686135 +0000 UTC m=+132.783318519" watchObservedRunningTime="2026-04-22 18:44:56.544205244 +0000 UTC m=+132.783837603" Apr 22 18:44:57.115687 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.115652 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv"] Apr 22 18:44:57.117761 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.117739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" Apr 22 18:44:57.120166 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.120145 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:44:57.121071 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.121051 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-pjdk9\"" Apr 22 18:44:57.121164 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.121086 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:44:57.127507 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.127485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv"] Apr 22 18:44:57.231568 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.231543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9j8d\" (UniqueName: \"kubernetes.io/projected/e2533f65-e508-47ea-9cb7-8bb858479a89-kube-api-access-x9j8d\") pod \"migrator-74bb7799d9-cnpcv\" (UID: \"e2533f65-e508-47ea-9cb7-8bb858479a89\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" Apr 22 18:44:57.332643 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.332601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9j8d\" (UniqueName: \"kubernetes.io/projected/e2533f65-e508-47ea-9cb7-8bb858479a89-kube-api-access-x9j8d\") pod \"migrator-74bb7799d9-cnpcv\" (UID: \"e2533f65-e508-47ea-9cb7-8bb858479a89\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" Apr 22 18:44:57.341251 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.341220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9j8d\" (UniqueName: \"kubernetes.io/projected/e2533f65-e508-47ea-9cb7-8bb858479a89-kube-api-access-x9j8d\") pod \"migrator-74bb7799d9-cnpcv\" (UID: \"e2533f65-e508-47ea-9cb7-8bb858479a89\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" Apr 22 18:44:57.426410 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.426335 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" Apr 22 18:44:57.539410 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:57.539383 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv"] Apr 22 18:44:57.542434 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:44:57.542412 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2533f65_e508_47ea_9cb7_8bb858479a89.slice/crio-ce7526c45ed6a838b4406b61fef4ba124bc9781dd4aff72e5222491f473538fc WatchSource:0}: Error finding container ce7526c45ed6a838b4406b61fef4ba124bc9781dd4aff72e5222491f473538fc: Status 404 returned error can't find the container with id ce7526c45ed6a838b4406b61fef4ba124bc9781dd4aff72e5222491f473538fc Apr 22 18:44:58.534073 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:58.534038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" event={"ID":"e2533f65-e508-47ea-9cb7-8bb858479a89","Type":"ContainerStarted","Data":"ce7526c45ed6a838b4406b61fef4ba124bc9781dd4aff72e5222491f473538fc"} Apr 22 18:44:58.643049 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:58.643005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:58.643066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:58.643097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:58.643151 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:58.643192 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:14.643171673 +0000 UTC m=+150.882804038 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : configmap references non-existent config key: service-ca.crt Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:58.643226 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs podName:d296e50b-a805-4e1b-9297-f74fb4549ed5 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:14.643213872 +0000 UTC m=+150.882846213 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs") pod "router-default-7cc989c66-cc7nk" (UID: "d296e50b-a805-4e1b-9297-f74fb4549ed5") : secret "router-metrics-certs-default" not found Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:58.643231 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:58.643503 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:58.643281 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls podName:309b4d32-03cc-43ef-b0f0-8f772378a81a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:14.643266186 +0000 UTC m=+150.882898539 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-p7ksl" (UID: "309b4d32-03cc-43ef-b0f0-8f772378a81a") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:44:59.006555 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.006522 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:44:59.006914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.006901 2575 scope.go:117] "RemoveContainer" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" Apr 22 18:44:59.007065 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:59.007049 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:44:59.538333 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.538293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" event={"ID":"e2533f65-e508-47ea-9cb7-8bb858479a89","Type":"ContainerStarted","Data":"047e50fb84fa15562435fa2849bf10137ee292a95cd66e6e4650626c05406f57"} Apr 22 18:44:59.538333 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.538328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" 
event={"ID":"e2533f65-e508-47ea-9cb7-8bb858479a89","Type":"ContainerStarted","Data":"eb4c5ff2545a499592b077f8aed56f265abddb74c0c0c01e5b13ce466f3f908f"} Apr 22 18:44:59.558122 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.558078 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-cnpcv" podStartSLOduration=1.298253072 podStartE2EDuration="2.558064607s" podCreationTimestamp="2026-04-22 18:44:57 +0000 UTC" firstStartedPulling="2026-04-22 18:44:57.544045516 +0000 UTC m=+133.783677867" lastFinishedPulling="2026-04-22 18:44:58.803857062 +0000 UTC m=+135.043489402" observedRunningTime="2026-04-22 18:44:59.556481451 +0000 UTC m=+135.796113809" watchObservedRunningTime="2026-04-22 18:44:59.558064607 +0000 UTC m=+135.797696963" Apr 22 18:44:59.780130 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.780095 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zqnzm"] Apr 22 18:44:59.781976 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.781955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.784397 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.784377 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:44:59.784507 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.784460 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:44:59.784568 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.784533 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:44:59.785399 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.785383 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h7vq4\"" Apr 22 18:44:59.785488 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.785412 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:44:59.796398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.796354 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zqnzm"] Apr 22 18:44:59.851995 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.851966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37f17e4f-8487-4af2-b0cb-77595be064c5-data-volume\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.852125 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.852017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvjg\" (UniqueName: \"kubernetes.io/projected/37f17e4f-8487-4af2-b0cb-77595be064c5-kube-api-access-jsvjg\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.852125 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.852046 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37f17e4f-8487-4af2-b0cb-77595be064c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.852125 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.852070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.852125 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.852098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37f17e4f-8487-4af2-b0cb-77595be064c5-crio-socket\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952424 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37f17e4f-8487-4af2-b0cb-77595be064c5-data-volume\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952531 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvjg\" (UniqueName: \"kubernetes.io/projected/37f17e4f-8487-4af2-b0cb-77595be064c5-kube-api-access-jsvjg\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952568 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37f17e4f-8487-4af2-b0cb-77595be064c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952600 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952653 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37f17e4f-8487-4af2-b0cb-77595be064c5-crio-socket\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952762 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/37f17e4f-8487-4af2-b0cb-77595be064c5-data-volume\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952832 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:59.952792 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:44:59.952832 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.952812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/37f17e4f-8487-4af2-b0cb-77595be064c5-crio-socket\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.952890 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:44:59.952850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls podName:37f17e4f-8487-4af2-b0cb-77595be064c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:00.452833196 +0000 UTC m=+136.692465533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-zqnzm" (UID: "37f17e4f-8487-4af2-b0cb-77595be064c5") : secret "insights-runtime-extractor-tls" not found Apr 22 18:44:59.953034 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.953019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/37f17e4f-8487-4af2-b0cb-77595be064c5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:44:59.967905 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:44:59.967880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvjg\" (UniqueName: \"kubernetes.io/projected/37f17e4f-8487-4af2-b0cb-77595be064c5-kube-api-access-jsvjg\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:00.456879 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:00.456839 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:00.457045 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:00.456985 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:00.457094 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:00.457049 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls podName:37f17e4f-8487-4af2-b0cb-77595be064c5 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:45:01.457034152 +0000 UTC m=+137.696666489 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-zqnzm" (UID: "37f17e4f-8487-4af2-b0cb-77595be064c5") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:01.464718 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:01.464680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:01.465076 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:01.464832 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:01.465076 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:01.464896 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls podName:37f17e4f-8487-4af2-b0cb-77595be064c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:03.464880682 +0000 UTC m=+139.704513022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-zqnzm" (UID: "37f17e4f-8487-4af2-b0cb-77595be064c5") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:01.507959 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:01.507922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:45:01.508278 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:01.508265 2575 scope.go:117] "RemoveContainer" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" Apr 22 18:45:01.508430 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:01.508415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:45:03.481615 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:03.481573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:03.482114 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:03.481750 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:03.482114 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:03.481871 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls podName:37f17e4f-8487-4af2-b0cb-77595be064c5 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:07.481849865 +0000 UTC m=+143.721482208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls") pod "insights-runtime-extractor-zqnzm" (UID: "37f17e4f-8487-4af2-b0cb-77595be064c5") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:07.512663 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:07.512620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:07.514860 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:07.514841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/37f17e4f-8487-4af2-b0cb-77595be064c5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zqnzm\" (UID: \"37f17e4f-8487-4af2-b0cb-77595be064c5\") " pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:07.591051 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:07.591019 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zqnzm" Apr 22 18:45:07.707616 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:07.707590 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zqnzm"] Apr 22 18:45:07.710724 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:07.710686 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f17e4f_8487_4af2_b0cb_77595be064c5.slice/crio-4a2e91690f495ac921ad1d68ce83e665d2a0be8f18c6bbc97bea4dd37ec91164 WatchSource:0}: Error finding container 4a2e91690f495ac921ad1d68ce83e665d2a0be8f18c6bbc97bea4dd37ec91164: Status 404 returned error can't find the container with id 4a2e91690f495ac921ad1d68ce83e665d2a0be8f18c6bbc97bea4dd37ec91164 Apr 22 18:45:08.563630 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:08.563602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zqnzm" event={"ID":"37f17e4f-8487-4af2-b0cb-77595be064c5","Type":"ContainerStarted","Data":"468b52f23d82d1b5eb2bf2e820a539d0c8fc7f1a5324308cf94271181aa5a066"} Apr 22 18:45:08.563914 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:08.563636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zqnzm" event={"ID":"37f17e4f-8487-4af2-b0cb-77595be064c5","Type":"ContainerStarted","Data":"4a2e91690f495ac921ad1d68ce83e665d2a0be8f18c6bbc97bea4dd37ec91164"} Apr 22 18:45:09.568705 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:09.568672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zqnzm" event={"ID":"37f17e4f-8487-4af2-b0cb-77595be064c5","Type":"ContainerStarted","Data":"be0ed36b463f584e5d98dad8e0dccc9452b3567623a9336fc83f27be1b807d39"} Apr 22 18:45:10.573223 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:45:10.573185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zqnzm" event={"ID":"37f17e4f-8487-4af2-b0cb-77595be064c5","Type":"ContainerStarted","Data":"a3850eaf4418b8e2dfcb4e50446d98e6c044196677590e4fd01017fde7407809"} Apr 22 18:45:10.593548 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:10.593498 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zqnzm" podStartSLOduration=9.255165271 podStartE2EDuration="11.593481467s" podCreationTimestamp="2026-04-22 18:44:59 +0000 UTC" firstStartedPulling="2026-04-22 18:45:07.754147371 +0000 UTC m=+143.993779711" lastFinishedPulling="2026-04-22 18:45:10.092463562 +0000 UTC m=+146.332095907" observedRunningTime="2026-04-22 18:45:10.592765558 +0000 UTC m=+146.832397918" watchObservedRunningTime="2026-04-22 18:45:10.593481467 +0000 UTC m=+146.833113827" Apr 22 18:45:14.192495 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.192465 2575 scope.go:117] "RemoveContainer" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" Apr 22 18:45:14.583132 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.583106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:45:14.583476 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.583462 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/1.log" Apr 22 18:45:14.583523 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.583494 2575 generic.go:358] "Generic (PLEG): container finished" podID="03550605-e0bb-4434-8e90-08b3aecc5a4c" containerID="7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8" exitCode=255 Apr 22 18:45:14.583555 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.583543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" event={"ID":"03550605-e0bb-4434-8e90-08b3aecc5a4c","Type":"ContainerDied","Data":"7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8"} Apr 22 18:45:14.583591 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.583568 2575 scope.go:117] "RemoveContainer" containerID="23a3bc3772c366af5063574a97d32ba89cbc82d5f8e8a3f0cfc2037a86abab1e" Apr 22 18:45:14.583976 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.583954 2575 scope.go:117] "RemoveContainer" containerID="7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8" Apr 22 18:45:14.584162 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:14.584141 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:45:14.671788 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.671753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: 
\"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:14.671898 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.671811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:14.671898 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.671840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:45:14.672375 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.672352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d296e50b-a805-4e1b-9297-f74fb4549ed5-service-ca-bundle\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:14.674077 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.674046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d296e50b-a805-4e1b-9297-f74fb4549ed5-metrics-certs\") pod \"router-default-7cc989c66-cc7nk\" (UID: \"d296e50b-a805-4e1b-9297-f74fb4549ed5\") " pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:14.674306 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.674289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/309b4d32-03cc-43ef-b0f0-8f772378a81a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-p7ksl\" (UID: \"309b4d32-03cc-43ef-b0f0-8f772378a81a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:45:14.966493 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.966463 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fdghj\"" Apr 22 18:45:14.970448 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.970428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-jrz95\"" Apr 22 18:45:14.974407 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.974389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" Apr 22 18:45:14.979242 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:14.979221 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:15.099273 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.099244 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7cc989c66-cc7nk"] Apr 22 18:45:15.103046 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:15.103023 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd296e50b_a805_4e1b_9297_f74fb4549ed5.slice/crio-eacc229b13c3e81a8ba68b9676fa89e8009ca5a1d885dd384728eb9a6a92474f WatchSource:0}: Error finding container eacc229b13c3e81a8ba68b9676fa89e8009ca5a1d885dd384728eb9a6a92474f: Status 404 returned error can't find the container with id eacc229b13c3e81a8ba68b9676fa89e8009ca5a1d885dd384728eb9a6a92474f Apr 22 18:45:15.112364 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.112281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl"] Apr 22 18:45:15.114636 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:15.114610 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod309b4d32_03cc_43ef_b0f0_8f772378a81a.slice/crio-785590dda840f70452f12a131cb803b7a9cdd65b849e6bb37ff67096d1a8c117 WatchSource:0}: Error finding container 785590dda840f70452f12a131cb803b7a9cdd65b849e6bb37ff67096d1a8c117: Status 404 returned error can't find the container with id 785590dda840f70452f12a131cb803b7a9cdd65b849e6bb37ff67096d1a8c117 Apr 22 18:45:15.587412 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.587374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7cc989c66-cc7nk" event={"ID":"d296e50b-a805-4e1b-9297-f74fb4549ed5","Type":"ContainerStarted","Data":"5c1d6fc92ef7c5a177d13c44e4a624231cc6e684203d42a40c60fba221538d7f"} Apr 22 18:45:15.587412 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.587409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7cc989c66-cc7nk" event={"ID":"d296e50b-a805-4e1b-9297-f74fb4549ed5","Type":"ContainerStarted","Data":"eacc229b13c3e81a8ba68b9676fa89e8009ca5a1d885dd384728eb9a6a92474f"} Apr 22 18:45:15.588439 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.588412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" event={"ID":"309b4d32-03cc-43ef-b0f0-8f772378a81a","Type":"ContainerStarted","Data":"785590dda840f70452f12a131cb803b7a9cdd65b849e6bb37ff67096d1a8c117"} Apr 22 18:45:15.589640 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.589623 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:45:15.607389 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.607344 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7cc989c66-cc7nk" podStartSLOduration=33.607327411 podStartE2EDuration="33.607327411s" podCreationTimestamp="2026-04-22 18:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:15.606630163 +0000 UTC m=+151.846262523" watchObservedRunningTime="2026-04-22 18:45:15.607327411 +0000 UTC m=+151.846959771" Apr 22 18:45:15.979955 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:45:15.979906 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:15.982768 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:15.982745 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:16.593041 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:16.593006 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:16.594307 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:16.594286 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7cc989c66-cc7nk" Apr 22 18:45:17.595833 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:17.595803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" event={"ID":"309b4d32-03cc-43ef-b0f0-8f772378a81a","Type":"ContainerStarted","Data":"24494094490a3d9f7e4c4e38601ffa5f189af5d4c01f17410f82bce716da5a42"} Apr 22 18:45:17.613356 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:17.613310 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-p7ksl" podStartSLOduration=33.855244933 podStartE2EDuration="35.613294631s" podCreationTimestamp="2026-04-22 18:44:42 +0000 UTC" firstStartedPulling="2026-04-22 18:45:15.116287569 +0000 UTC m=+151.355919912" lastFinishedPulling="2026-04-22 18:45:16.874337273 +0000 UTC m=+153.113969610" observedRunningTime="2026-04-22 18:45:17.612089167 +0000 UTC m=+153.851721523" watchObservedRunningTime="2026-04-22 18:45:17.613294631 +0000 UTC m=+153.852926990" Apr 22 18:45:19.006130 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:19.006095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:45:19.006612 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:19.006444 2575 scope.go:117] "RemoveContainer" containerID="7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8" Apr 22 18:45:19.006612 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:19.006600 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:45:19.717343 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:19.717305 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-sn54r" podUID="dcba4051-c58c-4ba8-baba-853741840882" Apr 22 18:45:19.729488 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:19.729463 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-grs9r" podUID="6922ad30-ba0a-4bf8-b384-cdf6a0514c3a" Apr 22 18:45:20.201009 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:20.200919 2575 pod_workers.go:1301] "Error 
syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7zmbr" podUID="19ace946-23b0-451c-93fa-078938130dd5" Apr 22 18:45:20.276817 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.276765 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84"] Apr 22 18:45:20.280071 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.280053 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:20.286604 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.286585 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7wndm\"" Apr 22 18:45:20.286684 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.286621 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:45:20.299029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.299009 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84"] Apr 22 18:45:20.315629 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.315595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fa4aef80-7b9b-4870-b6c3-3c67622e5979-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6zp84\" (UID: \"fa4aef80-7b9b-4870-b6c3-3c67622e5979\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:20.416708 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.416677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fa4aef80-7b9b-4870-b6c3-3c67622e5979-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6zp84\" (UID: \"fa4aef80-7b9b-4870-b6c3-3c67622e5979\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:20.419005 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.418986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fa4aef80-7b9b-4870-b6c3-3c67622e5979-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6zp84\" (UID: \"fa4aef80-7b9b-4870-b6c3-3c67622e5979\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:20.589093 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.589059 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:20.603526 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.603500 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:45:20.719130 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:20.719098 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84"] Apr 22 18:45:20.722300 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:20.722279 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4aef80_7b9b_4870_b6c3_3c67622e5979.slice/crio-f42cc2cb7551b240ea451a8d6f530dfd9e739bad2df6d0c94b1e0628c6e876ac WatchSource:0}: Error finding container f42cc2cb7551b240ea451a8d6f530dfd9e739bad2df6d0c94b1e0628c6e876ac: Status 404 returned error can't find the container with id f42cc2cb7551b240ea451a8d6f530dfd9e739bad2df6d0c94b1e0628c6e876ac Apr 22 18:45:21.508622 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:21.508574 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:45:21.509022 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:21.508966 2575 scope.go:117] "RemoveContainer" containerID="7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8" Apr 22 18:45:21.509135 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:21.509118 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:45:21.606973 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:21.606932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" event={"ID":"fa4aef80-7b9b-4870-b6c3-3c67622e5979","Type":"ContainerStarted","Data":"f42cc2cb7551b240ea451a8d6f530dfd9e739bad2df6d0c94b1e0628c6e876ac"} Apr 22 18:45:22.610743 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:22.610710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" event={"ID":"fa4aef80-7b9b-4870-b6c3-3c67622e5979","Type":"ContainerStarted","Data":"42e9900d0534c16f75f1fa416bdbd7739fbb77c4e0ff6a9437de4df7fb6680ab"} Apr 22 18:45:22.611215 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:22.610913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:22.615231 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:22.615211 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" Apr 22 18:45:22.633992 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:22.633952 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6zp84" podStartSLOduration=1.295300557 podStartE2EDuration="2.633937108s" podCreationTimestamp="2026-04-22 18:45:20 +0000 UTC" firstStartedPulling="2026-04-22 18:45:20.724575306 +0000 UTC m=+156.964207648" lastFinishedPulling="2026-04-22 18:45:22.063211857 +0000 UTC m=+158.302844199" observedRunningTime="2026-04-22 18:45:22.633396049 
+0000 UTC m=+158.873028409" watchObservedRunningTime="2026-04-22 18:45:22.633937108 +0000 UTC m=+158.873569472" Apr 22 18:45:23.523220 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.523146 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kl76v"] Apr 22 18:45:23.526174 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.526158 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.529052 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.529028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 18:45:23.529330 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.529314 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-v5ztw\"" Apr 22 18:45:23.529946 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.529931 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 18:45:23.530235 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.530215 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:45:23.539904 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.539885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7mk\" (UniqueName: \"kubernetes.io/projected/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-kube-api-access-6q7mk\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.539996 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.539926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.539996 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.539963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.540068 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.539995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.543122 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.543102 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kl76v"] Apr 22 18:45:23.641138 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:45:23.641100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.641138 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.641143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.641658 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.641224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7mk\" (UniqueName: \"kubernetes.io/projected/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-kube-api-access-6q7mk\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.641658 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.641288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.641956 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.641936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.643479 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.643459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.643584 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.643522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.653805 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.653761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7mk\" (UniqueName: \"kubernetes.io/projected/ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b-kube-api-access-6q7mk\") pod \"prometheus-operator-5676c8c784-kl76v\" (UID: \"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.834824 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.834718 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" Apr 22 18:45:23.966298 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:23.966267 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kl76v"] Apr 22 18:45:23.969478 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:23.969452 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4bc19f_0abc_4fdf_86c4_bfe2fc933d7b.slice/crio-00cf9a4f3be0e6acc0c02fc09a81abf401a2b4a05fda22a7608cb8c70f01b499 WatchSource:0}: Error finding container 00cf9a4f3be0e6acc0c02fc09a81abf401a2b4a05fda22a7608cb8c70f01b499: Status 404 returned error can't find the container with id 00cf9a4f3be0e6acc0c02fc09a81abf401a2b4a05fda22a7608cb8c70f01b499 Apr 22 18:45:24.616825 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.616765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" event={"ID":"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b","Type":"ContainerStarted","Data":"00cf9a4f3be0e6acc0c02fc09a81abf401a2b4a05fda22a7608cb8c70f01b499"} Apr 22 18:45:24.648530 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.648495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:45:24.648975 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.648824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:45:24.650899 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.650872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6922ad30-ba0a-4bf8-b384-cdf6a0514c3a-metrics-tls\") pod \"dns-default-grs9r\" (UID: \"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a\") " pod="openshift-dns/dns-default-grs9r" Apr 22 18:45:24.651064 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.651048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcba4051-c58c-4ba8-baba-853741840882-cert\") pod \"ingress-canary-sn54r\" (UID: \"dcba4051-c58c-4ba8-baba-853741840882\") " pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:45:24.807901 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.807866 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrbrj\"" Apr 22 18:45:24.814511 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.814483 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sn54r" Apr 22 18:45:24.956630 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:24.956608 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sn54r"] Apr 22 18:45:24.958741 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:24.958717 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcba4051_c58c_4ba8_baba_853741840882.slice/crio-83643c29121663fd749a2ed6a1115c0fc77942aa9d5bb50fdf7df592f53c262c WatchSource:0}: Error finding container 83643c29121663fd749a2ed6a1115c0fc77942aa9d5bb50fdf7df592f53c262c: Status 404 returned error can't find the container with id 83643c29121663fd749a2ed6a1115c0fc77942aa9d5bb50fdf7df592f53c262c Apr 22 18:45:25.619957 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:25.619912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sn54r" event={"ID":"dcba4051-c58c-4ba8-baba-853741840882","Type":"ContainerStarted","Data":"83643c29121663fd749a2ed6a1115c0fc77942aa9d5bb50fdf7df592f53c262c"} Apr 22 18:45:26.624566 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:26.624518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" event={"ID":"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b","Type":"ContainerStarted","Data":"249f31c831c7e680bdb781a45a90f534340fc70eadf564e5b445174ad9a8ac45"} Apr 22 18:45:26.624566 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:26.624558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" event={"ID":"ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b","Type":"ContainerStarted","Data":"d2cb1c946d6a41eb4a4b8e0a64ff7745ca4e2aa2d9eae8e38205643db6f92bdf"} Apr 22 18:45:26.642564 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:26.642518 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-kl76v" podStartSLOduration=1.6787263129999999 podStartE2EDuration="3.642504304s" podCreationTimestamp="2026-04-22 18:45:23 +0000 UTC" firstStartedPulling="2026-04-22 18:45:23.971247841 +0000 UTC m=+160.210880178" lastFinishedPulling="2026-04-22 18:45:25.935025828 +0000 UTC m=+162.174658169" observedRunningTime="2026-04-22 18:45:26.641369652 +0000 UTC m=+162.881002012" watchObservedRunningTime="2026-04-22 18:45:26.642504304 +0000 UTC m=+162.882136663" Apr 22 18:45:27.630392 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:27.630357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sn54r" event={"ID":"dcba4051-c58c-4ba8-baba-853741840882","Type":"ContainerStarted","Data":"ed725cffeb147f0e0d09b530d7fd20ee70ffdb0a9305084d0501f227fd8f6d93"} Apr 22 18:45:27.648302 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:27.648239 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sn54r" podStartSLOduration=129.845190754 podStartE2EDuration="2m11.648220855s" podCreationTimestamp="2026-04-22 18:43:16 +0000 UTC" firstStartedPulling="2026-04-22 18:45:24.960575125 +0000 UTC m=+161.200207462" lastFinishedPulling="2026-04-22 18:45:26.763605211 +0000 UTC m=+163.003237563" observedRunningTime="2026-04-22 18:45:27.64723469 +0000 UTC m=+163.886867051" watchObservedRunningTime="2026-04-22 18:45:27.648220855 +0000 UTC m=+163.887853258" Apr 22 18:45:28.897093 
ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.897061 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rk5lg"] Apr 22 18:45:28.900330 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.900308 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f87fm"] Apr 22 18:45:28.900482 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.900464 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.903292 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.903274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.903982 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.903957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:45:28.904090 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.903994 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:45:28.904090 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.904020 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:45:28.904090 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.903964 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mnhfl\"" Apr 22 18:45:28.906040 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.906024 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:45:28.906277 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.906254 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-tgx7h\"" Apr 22 18:45:28.906457 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.906442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 18:45:28.906534 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.906517 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 18:45:28.917535 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.917513 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f87fm"] Apr 22 18:45:28.985073 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-metrics-client-ca\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985073 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-accelerators-collector-config\") 
pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985279 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-sys\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985279 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985181 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.985279 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f2ccbe-3de8-4405-9e87-201aa5a5b773-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.985279 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-wtmp\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985279 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985437 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.985437 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.985437 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/33f2ccbe-3de8-4405-9e87-201aa5a5b773-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.985437 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsxk\" (UniqueName: \"kubernetes.io/projected/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-kube-api-access-xxsxk\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985437 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5sz\" (UniqueName: \"kubernetes.io/projected/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-api-access-9l5sz\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:28.985437 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-root\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985616 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-tls\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:28.985616 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:28.985473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-textfile\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.086466 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.086466 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/33f2ccbe-3de8-4405-9e87-201aa5a5b773-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsxk\" (UniqueName: \"kubernetes.io/projected/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-kube-api-access-xxsxk\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5sz\" (UniqueName: \"kubernetes.io/projected/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-api-access-9l5sz\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-root\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-tls\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.086755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-root\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-textfile\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-metrics-client-ca\") pod \"node-exporter-rk5lg\" (UID: 
\"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-accelerators-collector-config\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-sys\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.086967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f2ccbe-3de8-4405-9e87-201aa5a5b773-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-wtmp\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/33f2ccbe-3de8-4405-9e87-201aa5a5b773-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.087149 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-sys\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-wtmp\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087309 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.087577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-metrics-client-ca\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087694 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-textfile\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087694 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-accelerators-collector-config\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.087986 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.087965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f2ccbe-3de8-4405-9e87-201aa5a5b773-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.088966 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.088949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-tls\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.089095 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.089072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.090912 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.090892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.091002 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.090930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.094762 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.094742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsxk\" (UniqueName: \"kubernetes.io/projected/64727fd4-5eae-4fbd-ad64-ec2f5828bfbd-kube-api-access-xxsxk\") pod \"node-exporter-rk5lg\" (UID: \"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd\") " pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.095191 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.095171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5sz\" (UniqueName: \"kubernetes.io/projected/33f2ccbe-3de8-4405-9e87-201aa5a5b773-kube-api-access-9l5sz\") pod \"kube-state-metrics-69db897b98-f87fm\" (UID: \"33f2ccbe-3de8-4405-9e87-201aa5a5b773\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.210836 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.210799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rk5lg" Apr 22 18:45:29.215539 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.215516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" Apr 22 18:45:29.220104 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:29.220078 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64727fd4_5eae_4fbd_ad64_ec2f5828bfbd.slice/crio-ae21872f81826eac7e2aef80e7a8809f36e3dff73b67a09735129c8fa1a98c4f WatchSource:0}: Error finding container ae21872f81826eac7e2aef80e7a8809f36e3dff73b67a09735129c8fa1a98c4f: Status 404 returned error can't find the container with id ae21872f81826eac7e2aef80e7a8809f36e3dff73b67a09735129c8fa1a98c4f Apr 22 18:45:29.347959 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.347915 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f87fm"] Apr 22 18:45:29.350860 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:29.350825 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f2ccbe_3de8_4405_9e87_201aa5a5b773.slice/crio-52916e9482520b07feb5de6635da067f991da9e9e2e78c5294da13575481ced0 WatchSource:0}: Error finding container 52916e9482520b07feb5de6635da067f991da9e9e2e78c5294da13575481ced0: Status 404 returned error can't find the container with id 52916e9482520b07feb5de6635da067f991da9e9e2e78c5294da13575481ced0 Apr 22 18:45:29.637026 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.636940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" event={"ID":"33f2ccbe-3de8-4405-9e87-201aa5a5b773","Type":"ContainerStarted","Data":"52916e9482520b07feb5de6635da067f991da9e9e2e78c5294da13575481ced0"} Apr 22 18:45:29.638096 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:29.638070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk5lg" event={"ID":"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd","Type":"ContainerStarted","Data":"ae21872f81826eac7e2aef80e7a8809f36e3dff73b67a09735129c8fa1a98c4f"} Apr 22 18:45:30.075943 
ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.075903 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:45:30.079311 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.079288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.082134 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.082095 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:45:30.082605 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.082506 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:45:30.082605 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.082514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:45:30.082605 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.082562 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:45:30.082861 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.082844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:45:30.083143 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.082998 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:45:30.083647 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.083410 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:45:30.083647 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.083444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:45:30.083647 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.083480 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-t4m9z\"" Apr 22 18:45:30.083647 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.083604 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:45:30.103872 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.103841 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198235 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-volume\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wdf\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-kube-api-access-h9wdf\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-web-config\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-tls-assets\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198421 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.198631 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198541 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.199306 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-out\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.199306 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.198760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300195 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-out\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300381 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300381 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-volume\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300381 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300381 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wdf\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-kube-api-access-h9wdf\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300441 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-web-config\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-tls-assets\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300619 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300956 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.300956 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.300693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.302105 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.301744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.302105 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.301965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.302105 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.302050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.303036 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.303010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-out\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.303335 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.303310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.303842 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.303814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.304276 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.304246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.304793 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.304733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-volume\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.305871 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.305838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.306185 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.306163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.306185 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.306178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-web-config\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.307221 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.307203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-tls-assets\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.308889 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.308869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wdf\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-kube-api-access-h9wdf\") pod \"alertmanager-main-0\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.401903 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.401866 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:45:30.541260 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.541226 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:45:30.545393 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:30.545359 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19695eb5_0486_498b_993c_b32dc7a7f2ff.slice/crio-5acb8d5b6b513d428e1a6e27bf7af61bd9ef72de851244541cb6308d96e2b60f WatchSource:0}: Error finding container 5acb8d5b6b513d428e1a6e27bf7af61bd9ef72de851244541cb6308d96e2b60f: Status 404 returned error can't find the container with id 5acb8d5b6b513d428e1a6e27bf7af61bd9ef72de851244541cb6308d96e2b60f Apr 22 18:45:30.642246 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.642164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"5acb8d5b6b513d428e1a6e27bf7af61bd9ef72de851244541cb6308d96e2b60f"} Apr 22 18:45:30.643900 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.643862 2575 generic.go:358] "Generic (PLEG): container finished" podID="64727fd4-5eae-4fbd-ad64-ec2f5828bfbd" containerID="298cea09b8c33ed5ce642497215186e528e70ef64245c2b129c0d5c44870484d" exitCode=0 Apr 22 18:45:30.644123 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:30.643920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk5lg" event={"ID":"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd","Type":"ContainerDied","Data":"298cea09b8c33ed5ce642497215186e528e70ef64245c2b129c0d5c44870484d"} Apr 22 18:45:31.190297 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.190178 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:45:31.648471 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.648422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" event={"ID":"33f2ccbe-3de8-4405-9e87-201aa5a5b773","Type":"ContainerStarted","Data":"3d32da9499f053fd21fd95082dc768e1ed6a5c1e3fccf0f63f6166b838683bdc"} Apr 22 18:45:31.648471 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.648470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" event={"ID":"33f2ccbe-3de8-4405-9e87-201aa5a5b773","Type":"ContainerStarted","Data":"acdcfee02f85e07e9da009d5c7e91a6d47ffcb2bebbdea587dad3ed705b7e34f"} Apr 22 18:45:31.648816 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.648487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" event={"ID":"33f2ccbe-3de8-4405-9e87-201aa5a5b773","Type":"ContainerStarted","Data":"e53b61fd5c369fcd3c76b952e7fd15ac7a74a7e74af7bc60bccd40325d595ece"} Apr 22 18:45:31.650548 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.650518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk5lg" event={"ID":"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd","Type":"ContainerStarted","Data":"d6efc5d207130820f37b8af8625b7e02774db655ed2cdee6300c05107b369883"} Apr 22 18:45:31.650548 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.650550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rk5lg" event={"ID":"64727fd4-5eae-4fbd-ad64-ec2f5828bfbd","Type":"ContainerStarted","Data":"a60b8679fdfd442f4812997b3fea747337a7fd06803425889cd1bc0ffcbf1d7b"} Apr 22 18:45:31.668593 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.668543 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-f87fm" podStartSLOduration=2.127735837 podStartE2EDuration="3.66852633s" podCreationTimestamp="2026-04-22 18:45:28 +0000 UTC" firstStartedPulling="2026-04-22 18:45:29.352554522 +0000 UTC m=+165.592186863" lastFinishedPulling="2026-04-22 18:45:30.893345008 +0000 UTC m=+167.132977356" observedRunningTime="2026-04-22 18:45:31.667478587 +0000 UTC m=+167.907110946" watchObservedRunningTime="2026-04-22 18:45:31.66852633 +0000 UTC m=+167.908158689" Apr 22 18:45:31.692445 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:31.692384 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rk5lg" podStartSLOduration=2.827227116 podStartE2EDuration="3.692366157s" podCreationTimestamp="2026-04-22 18:45:28 +0000 UTC" firstStartedPulling="2026-04-22 18:45:29.222023399 +0000 UTC m=+165.461655750" lastFinishedPulling="2026-04-22 18:45:30.087162448 +0000 UTC m=+166.326794791" observedRunningTime="2026-04-22 18:45:31.691232041 +0000 UTC m=+167.930864403" watchObservedRunningTime="2026-04-22 18:45:31.692366157 +0000 UTC m=+167.931998545" Apr 22 18:45:32.655246 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:32.655209 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5" exitCode=0 Apr 22 18:45:32.655635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:32.655313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5"} Apr 22 18:45:33.190702 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:33.190662 2575 scope.go:117] "RemoveContainer" containerID="7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8" Apr 22 18:45:33.190934 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:45:33.190913 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2r8wk_openshift-console-operator(03550605-e0bb-4434-8e90-08b3aecc5a4c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" podUID="03550605-e0bb-4434-8e90-08b3aecc5a4c" Apr 22 18:45:34.663615 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:34.663537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b"} Apr 22 18:45:34.663615 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:34.663574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c"} Apr 22 18:45:34.663615 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:34.663584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521"} Apr 22 18:45:34.663615 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:34.663592 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a"} Apr 22 18:45:34.663615 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:34.663601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25"} Apr 22 18:45:35.162727 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.160946 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:45:35.164260 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.164237 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.168082 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168010 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:45:35.168366 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168342 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:45:35.168366 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168362 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:45:35.168551 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168399 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:45:35.168718 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168694 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:45:35.168849 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168697 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tzpp8\"" Apr 22 18:45:35.168849 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:45:35.168849 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:45:35.168849 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.168815 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:45:35.169213 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.169195 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:45:35.169283 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.169209 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:45:35.169283 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.169211 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:45:35.169374 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.169212 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2d81em3pps9af\"" Apr 22 18:45:35.173660 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.173637 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:45:35.175622 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.175605 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:45:35.182447 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.182424 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:45:35.190279 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:45:35.190257 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-grs9r" Apr 22 18:45:35.193081 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.193045 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5kx7g\"" Apr 22 18:45:35.201264 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.201240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-grs9r" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.243806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.243860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.243915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.243958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.243989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5b6n\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-kube-api-access-v5b6n\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.244880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244398 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.246036 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.246036 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.244483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.328767 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.328724 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-grs9r"] Apr 22 18:45:35.331385 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:35.331357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6922ad30_ba0a_4bf8_b384_cdf6a0514c3a.slice/crio-7c879f6be5dea1fa0bbceb038aaf0bbaaa9f20147d4473824d9b6f31760bf6a7 WatchSource:0}: Error finding container 7c879f6be5dea1fa0bbceb038aaf0bbaaa9f20147d4473824d9b6f31760bf6a7: Status 404 returned error can't find the container with id 7c879f6be5dea1fa0bbceb038aaf0bbaaa9f20147d4473824d9b6f31760bf6a7 Apr 22 18:45:35.345510 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345636 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345636 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345636 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345636 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5b6n\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-kube-api-access-v5b6n\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345835 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345835 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345835 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.345835 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.345972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.346001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.346035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346375 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.346068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346375 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.346102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346471 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.346384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.346725 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.346698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.350058 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.348555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.350502 
ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.350468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.350988 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.350899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-web-config\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.351481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.351372 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.351481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.351450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.352335 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.351748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.352335 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.352113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.352335 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.352129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.352335 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.352295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.352570 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.352467 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.352698 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.352649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.353757 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.353731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.353917 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.353898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.354209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.354190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.354307 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.354225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config-out\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.361684 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.361662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5b6n\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-kube-api-access-v5b6n\") pod \"prometheus-k8s-0\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.477011 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.476972 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:35.606963 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.606932 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:45:35.610646 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:45:35.610615 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a3dc2d5_c865_43ec_94b2_fafce091e0da.slice/crio-36fe8a360c1f140235193a9e8abf1aafde05396c68cff214503347126564508b WatchSource:0}: Error finding container 36fe8a360c1f140235193a9e8abf1aafde05396c68cff214503347126564508b: Status 404 returned error can't find the container with id 36fe8a360c1f140235193a9e8abf1aafde05396c68cff214503347126564508b Apr 22 18:45:35.669378 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.669350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerStarted","Data":"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d"} Apr 22 18:45:35.670691 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.670668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"36fe8a360c1f140235193a9e8abf1aafde05396c68cff214503347126564508b"} Apr 22 18:45:35.671673 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.671645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-grs9r" event={"ID":"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a","Type":"ContainerStarted","Data":"7c879f6be5dea1fa0bbceb038aaf0bbaaa9f20147d4473824d9b6f31760bf6a7"} Apr 22 18:45:35.703747 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:35.703697 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.347171345 podStartE2EDuration="5.703680592s" podCreationTimestamp="2026-04-22 18:45:30 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.547906346 +0000 UTC m=+166.787538696" lastFinishedPulling="2026-04-22 18:45:34.904415604 +0000 UTC m=+171.144047943" observedRunningTime="2026-04-22 18:45:35.701909514 +0000 UTC m=+171.941541873" watchObservedRunningTime="2026-04-22 18:45:35.703680592 +0000 UTC m=+171.943312950" Apr 22 18:45:36.679144 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:36.679099 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="796fdcd3d2466a9cb8d98323066866c2486e0b24449a0d11d4c65fe3dfa3e6d8" exitCode=0 Apr 22 18:45:36.679558 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:36.679189 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"796fdcd3d2466a9cb8d98323066866c2486e0b24449a0d11d4c65fe3dfa3e6d8"} Apr 22 18:45:37.683994 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:37.683759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-grs9r" event={"ID":"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a","Type":"ContainerStarted","Data":"b77e858bd79194c843c32b4af54ce8809af8508cbeb213df3e50ec566e8da95f"} Apr 22 18:45:37.683994 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:37.683817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-grs9r" 
event={"ID":"6922ad30-ba0a-4bf8-b384-cdf6a0514c3a","Type":"ContainerStarted","Data":"fc9a7dd1f43a5e3dd5d12c57f8046a27c4ed0f831b724d67a75c249fbcfccea5"} Apr 22 18:45:37.683994 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:37.683973 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-grs9r" Apr 22 18:45:37.708400 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:37.708341 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-grs9r" podStartSLOduration=140.139879054 podStartE2EDuration="2m21.708319572s" podCreationTimestamp="2026-04-22 18:43:16 +0000 UTC" firstStartedPulling="2026-04-22 18:45:35.33527027 +0000 UTC m=+171.574902611" lastFinishedPulling="2026-04-22 18:45:36.903710789 +0000 UTC m=+173.143343129" observedRunningTime="2026-04-22 18:45:37.705809722 +0000 UTC m=+173.945442082" watchObservedRunningTime="2026-04-22 18:45:37.708319572 +0000 UTC m=+173.947951932" Apr 22 18:45:39.692383 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:39.692355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"53140b11d4360313c934cd744bef70562df0467a412bffadb325f02e886e1522"} Apr 22 18:45:39.692671 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:39.692392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"6724d5ab170fa7166269a0d130940401cf4c9ab9227b19848efd81215825d2e0"} Apr 22 18:45:41.701916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:41.701824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"1adbbd8628957d6b749eb5569f6be3fcb58c6eb1b7422ee4eeb482ac1b9ce089"} Apr 22 18:45:41.701916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:41.701858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"801765d47b09500281cd3151273c02791114d9e2c360dca7273971bea2445e95"} Apr 22 18:45:41.701916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:41.701869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"f8029498eeeccc900533e4cdb42d03f2e02eca97f3b1f67de9bfe273d75babb1"} Apr 22 18:45:41.701916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:41.701878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerStarted","Data":"9f2250fcac015b06488567b895d888b1ca864519b07171364b2de0dac663ca8b"} Apr 22 18:45:41.736820 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:41.736749 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.188715152 podStartE2EDuration="6.736735062s" podCreationTimestamp="2026-04-22 18:45:35 +0000 UTC" firstStartedPulling="2026-04-22 18:45:36.680538835 +0000 UTC m=+172.920171172" lastFinishedPulling="2026-04-22 18:45:41.22855873 +0000 UTC m=+177.468191082" observedRunningTime="2026-04-22 18:45:41.736469578 +0000 UTC m=+177.976101973" 
watchObservedRunningTime="2026-04-22 18:45:41.736735062 +0000 UTC m=+177.976367423" Apr 22 18:45:45.477471 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:45.477438 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:46.190762 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:46.190731 2575 scope.go:117] "RemoveContainer" containerID="7c28263e9cad9c592b0a35aa4b64fe77dceb739ae0bd1cc6f1e3ad2d680e12c8" Apr 22 18:45:46.719654 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:46.719616 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:45:46.720125 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:46.719763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" event={"ID":"03550605-e0bb-4434-8e90-08b3aecc5a4c","Type":"ContainerStarted","Data":"03d1639fb345de9e0c294e61f102f78c6186864dddb58ed312a75edd1e403f03"} Apr 22 18:45:46.720328 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:46.720306 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:45:46.725749 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:46.725728 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2r8wk" Apr 22 18:45:47.690689 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:47.690661 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-grs9r" Apr 22 18:45:56.913852 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:56.913824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-grs9r_6922ad30-ba0a-4bf8-b384-cdf6a0514c3a/dns/0.log" Apr 22 18:45:57.125904 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:57.125876 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-grs9r_6922ad30-ba0a-4bf8-b384-cdf6a0514c3a/kube-rbac-proxy/0.log" Apr 22 18:45:58.112651 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:58.112597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5jr6w_3bf65c2b-0944-4d58-bd8b-923617359ff3/dns-node-resolver/0.log" Apr 22 18:45:58.714237 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:58.714203 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cc989c66-cc7nk_d296e50b-a805-4e1b-9297-f74fb4549ed5/router/0.log" Apr 22 18:45:59.313479 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:45:59.313446 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sn54r_dcba4051-c58c-4ba8-baba-853741840882/serve-healthcheck-canary/0.log" Apr 22 18:46:06.778785 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:06.778729 2575 generic.go:358] "Generic (PLEG): container finished" podID="3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb" containerID="79fe45d00e5f51a14c7b489b8a5867deb757ffb11e6a215b782e065609c47b05" exitCode=0 Apr 22 18:46:06.779152 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:06.778803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" 
event={"ID":"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb","Type":"ContainerDied","Data":"79fe45d00e5f51a14c7b489b8a5867deb757ffb11e6a215b782e065609c47b05"} Apr 22 18:46:06.779152 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:06.779131 2575 scope.go:117] "RemoveContainer" containerID="79fe45d00e5f51a14c7b489b8a5867deb757ffb11e6a215b782e065609c47b05" Apr 22 18:46:07.783230 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:07.783195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-x4rmq" event={"ID":"3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb","Type":"ContainerStarted","Data":"73d6f27f65e7ae84369cf944a704518e6e39e6ef5f4098af08641b82d073bf2e"} Apr 22 18:46:35.477608 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:35.477568 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:35.492946 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:35.492920 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:35.884398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:35.884325 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:49.430922 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.430889 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:49.431432 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.431287 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="alertmanager" containerID="cri-o://6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25" gracePeriod=120 Apr 22 18:46:49.431432 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.431396 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-web" containerID="cri-o://e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521" gracePeriod=120 Apr 22 18:46:49.431595 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.431426 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy" containerID="cri-o://53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c" gracePeriod=120 Apr 22 18:46:49.431595 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.431405 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="prom-label-proxy" containerID="cri-o://47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d" gracePeriod=120 Apr 22 18:46:49.431595 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.431458 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="config-reloader" containerID="cri-o://3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a" gracePeriod=120 Apr 22 18:46:49.431595 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.431370 2575 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-metric" containerID="cri-o://067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b" gracePeriod=120 Apr 22 18:46:49.914041 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914008 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d" exitCode=0 Apr 22 18:46:49.914041 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914033 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c" exitCode=0 Apr 22 18:46:49.914041 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914041 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a" exitCode=0 Apr 22 18:46:49.914041 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914049 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25" exitCode=0 Apr 22 18:46:49.914326 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914081 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d"} Apr 22 18:46:49.914326 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c"} Apr 22 18:46:49.914326 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a"} Apr 22 18:46:49.914326 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:49.914133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25"} Apr 22 18:46:50.669271 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.669245 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:50.719076 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719048 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-trusted-ca-bundle\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719196 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719085 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-volume\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719196 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719110 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wdf\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-kube-api-access-h9wdf\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719329 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719231 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-main-tls\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719329 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719279 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-web\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719446 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719423 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-tls-assets\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719519 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719487 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-metrics-client-ca\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719519 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719483 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:50.719635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-web-config\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719548 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-metric\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719588 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-main-db\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719618 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-cluster-tls-config\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719865 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719653 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719865 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719733 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-out\") pod \"19695eb5-0486-498b-993c-b32dc7a7f2ff\" (UID: \"19695eb5-0486-498b-993c-b32dc7a7f2ff\") " Apr 22 18:46:50.719970 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.719893 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:50.720048 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.720028 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.720107 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.720054 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/19695eb5-0486-498b-993c-b32dc7a7f2ff-metrics-client-ca\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.721067 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.721015 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:46:50.722169 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.722144 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.722300 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.722272 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.722300 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.722262 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:50.722661 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.722390 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-kube-api-access-h9wdf" (OuterVolumeSpecName: "kube-api-access-h9wdf") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "kube-api-access-h9wdf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:50.723135 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.723103 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.723224 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.723138 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.723815 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.723759 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-out" (OuterVolumeSpecName: "config-out") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:46:50.724172 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.724146 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.727964 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.727520 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.734633 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.734604 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-web-config" (OuterVolumeSpecName: "web-config") pod "19695eb5-0486-498b-993c-b32dc7a7f2ff" (UID: "19695eb5-0486-498b-993c-b32dc7a7f2ff"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821020 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-volume\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821048 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9wdf\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-kube-api-access-h9wdf\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821058 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-main-tls\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821069 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821079 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19695eb5-0486-498b-993c-b32dc7a7f2ff-tls-assets\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821089 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-web-config\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821097 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821097 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821106 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-alertmanager-main-db\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821115 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-cluster-tls-config\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821124 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/19695eb5-0486-498b-993c-b32dc7a7f2ff-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.821398 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.821133 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/19695eb5-0486-498b-993c-b32dc7a7f2ff-config-out\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:50.924857 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924821 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b" exitCode=0 Apr 22 18:46:50.924857 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924850 2575 generic.go:358] "Generic (PLEG): container finished" podID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerID="e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521" exitCode=0 Apr 22 18:46:50.925039 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b"} Apr 22 18:46:50.925039 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521"} Apr 22 18:46:50.925039 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"19695eb5-0486-498b-993c-b32dc7a7f2ff","Type":"ContainerDied","Data":"5acb8d5b6b513d428e1a6e27bf7af61bd9ef72de851244541cb6308d96e2b60f"} Apr 22 18:46:50.925039 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924939 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:50.925039 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.924950 2575 scope.go:117] "RemoveContainer" containerID="47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d" Apr 22 18:46:50.932893 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.932871 2575 scope.go:117] "RemoveContainer" containerID="067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b" Apr 22 18:46:50.939412 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.939397 2575 scope.go:117] "RemoveContainer" containerID="53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c" Apr 22 18:46:50.946371 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.946352 2575 scope.go:117] "RemoveContainer" containerID="e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521" Apr 22 18:46:50.952237 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.952220 2575 scope.go:117] "RemoveContainer" containerID="3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a" Apr 22 18:46:50.956462 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.956443 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:50.958967 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.958954 2575 scope.go:117] "RemoveContainer" containerID="6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25" Apr 22 18:46:50.966235 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.966141 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:50.966280 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.966261 2575 scope.go:117] "RemoveContainer" containerID="24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5" Apr 22 18:46:50.972666 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.972651 2575 scope.go:117] "RemoveContainer" containerID="47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d" Apr 22 18:46:50.972919 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.972898 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d\": container with ID starting with 47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d not found: ID does not exist" containerID="47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d" Apr 22 18:46:50.972980 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.972927 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d"} err="failed to get container status \"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d\": rpc error: code = NotFound desc = could not find container \"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d\": container with ID starting with 47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d not found: ID does not exist" Apr 22 18:46:50.972980 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.972974 2575 scope.go:117] "RemoveContainer" containerID="067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b" Apr 22 18:46:50.973208 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.973189 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b\": container with ID starting with 067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b not found: ID does not exist" containerID="067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b" Apr 22 18:46:50.973247 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973215 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b"} err="failed to get container status \"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b\": rpc error: code = NotFound desc = could not find container \"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b\": container with ID starting with 067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b not found: ID does not exist" Apr 22 18:46:50.973247 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973231 2575 scope.go:117] "RemoveContainer" containerID="53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c" Apr 22 18:46:50.973429 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.973410 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c\": container with ID starting with 53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c not found: ID does not exist" containerID="53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c" Apr 22 18:46:50.973465 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973433 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c"} err="failed to get container status \"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c\": rpc error: code = NotFound desc = could not find container \"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c\": container with ID starting with 53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c not found: ID does not exist" Apr 22 18:46:50.973465 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973445 2575 scope.go:117] "RemoveContainer" containerID="e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521" Apr 22 18:46:50.973622 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.973609 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521\": container with ID starting with e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521 not found: ID does not exist" containerID="e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521" Apr 22 18:46:50.973663 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973624 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521"} err="failed to get container status \"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521\": rpc error: code = NotFound desc = could not find container \"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521\": container with ID starting with e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521 not found: ID does not exist" Apr 22 18:46:50.973663 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973634 2575 
scope.go:117] "RemoveContainer" containerID="3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a" Apr 22 18:46:50.973917 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.973900 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a\": container with ID starting with 3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a not found: ID does not exist" containerID="3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a" Apr 22 18:46:50.973977 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973920 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a"} err="failed to get container status \"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a\": rpc error: code = NotFound desc = could not find container \"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a\": container with ID starting with 3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a not found: ID does not exist" Apr 22 18:46:50.973977 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.973934 2575 scope.go:117] "RemoveContainer" containerID="6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25" Apr 22 18:46:50.974161 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.974146 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25\": container with ID starting with 6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25 not found: ID does not exist" containerID="6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25" Apr 22 18:46:50.974200 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974164 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25"} err="failed to get container status \"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25\": rpc error: code = NotFound desc = could not find container \"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25\": container with ID starting with 6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25 not found: ID does not exist" Apr 22 18:46:50.974200 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974186 2575 scope.go:117] "RemoveContainer" containerID="24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5" Apr 22 18:46:50.974396 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:46:50.974381 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5\": container with ID starting with 24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5 not found: ID does not exist" containerID="24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5" Apr 22 18:46:50.974431 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974398 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5"} err="failed to get container status \"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5\": rpc error: code = 
NotFound desc = could not find container \"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5\": container with ID starting with 24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5 not found: ID does not exist" Apr 22 18:46:50.974431 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974410 2575 scope.go:117] "RemoveContainer" containerID="47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d" Apr 22 18:46:50.974604 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974588 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d"} err="failed to get container status \"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d\": rpc error: code = NotFound desc = could not find container \"47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d\": container with ID starting with 47ffe3f929b596cefdbe7007df57c143c0c32e6a5f761c6fb506ac040ab9f37d not found: ID does not exist" Apr 22 18:46:50.974604 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974603 2575 scope.go:117] "RemoveContainer" containerID="067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b" Apr 22 18:46:50.974847 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974829 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b"} err="failed to get container status \"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b\": rpc error: code = NotFound desc = could not find container \"067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b\": container with ID starting with 067252ad673ac89a745ccdc477775baf34e605e9e2bbb5a35f703a7ea05bbb7b not found: ID does not exist" Apr 22 18:46:50.974899 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.974848 2575 scope.go:117] "RemoveContainer" containerID="53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c" Apr 22 18:46:50.975046 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975030 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c"} err="failed to get container status \"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c\": rpc error: code = NotFound desc = could not find container \"53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c\": container with ID starting with 53a5bf873f7203725ef320f89c669c05db05f9df3f5c4e3e1a8630b3ec312e1c not found: ID does not exist" Apr 22 18:46:50.975094 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975047 2575 scope.go:117] "RemoveContainer" containerID="e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521" Apr 22 18:46:50.975240 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975225 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521"} err="failed to get container status \"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521\": rpc error: code = NotFound desc = could not find container \"e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521\": container with ID starting with e4f8b7f8e62b0cb752ec71721e27e293240f574f1a85069cf6016d71f040c521 not found: ID does not exist" Apr 22 18:46:50.975240 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975239 
2575 scope.go:117] "RemoveContainer" containerID="3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a" Apr 22 18:46:50.975394 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975380 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a"} err="failed to get container status \"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a\": rpc error: code = NotFound desc = could not find container \"3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a\": container with ID starting with 3d4a130a865ece4cdb2bb7435b86960ffe6d075744f250ec79caf74b59bbe25a not found: ID does not exist" Apr 22 18:46:50.975435 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975394 2575 scope.go:117] "RemoveContainer" containerID="6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25" Apr 22 18:46:50.975592 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975576 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25"} err="failed to get container status \"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25\": rpc error: code = NotFound desc = could not find container \"6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25\": container with ID starting with 6a6e2530c1b4ed92b1095c134e27676a2fe444212740efbb0259ccc76ce8bc25 not found: ID does not exist" Apr 22 18:46:50.975634 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975593 2575 scope.go:117] "RemoveContainer" containerID="24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5" Apr 22 18:46:50.975750 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:50.975736 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5"} err="failed to get container status \"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5\": rpc error: code = NotFound desc = could not find container \"24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5\": container with ID starting with 24bc8e0e2eca13875446b0efd11d1633f9cb9ea8d116dc6185335ce6f63b27c5 not found: ID does not exist" Apr 22 18:46:52.195342 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:52.195306 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" path="/var/lib/kubelet/pods/19695eb5-0486-498b-993c-b32dc7a7f2ff/volumes" Apr 22 18:46:53.698581 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.698499 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:53.699675 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.699639 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="prometheus" containerID="cri-o://6724d5ab170fa7166269a0d130940401cf4c9ab9227b19848efd81215825d2e0" gracePeriod=600 Apr 22 18:46:53.700421 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.699847 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-thanos" containerID="cri-o://1adbbd8628957d6b749eb5569f6be3fcb58c6eb1b7422ee4eeb482ac1b9ce089" 
gracePeriod=600 Apr 22 18:46:53.700567 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.700060 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-web" containerID="cri-o://f8029498eeeccc900533e4cdb42d03f2e02eca97f3b1f67de9bfe273d75babb1" gracePeriod=600 Apr 22 18:46:53.700567 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.700081 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy" containerID="cri-o://801765d47b09500281cd3151273c02791114d9e2c360dca7273971bea2445e95" gracePeriod=600 Apr 22 18:46:53.700567 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.700097 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="thanos-sidecar" containerID="cri-o://9f2250fcac015b06488567b895d888b1ca864519b07171364b2de0dac663ca8b" gracePeriod=600 Apr 22 18:46:53.700567 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.700113 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="config-reloader" containerID="cri-o://53140b11d4360313c934cd744bef70562df0467a412bffadb325f02e886e1522" gracePeriod=600 Apr 22 18:46:53.938451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938402 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="1adbbd8628957d6b749eb5569f6be3fcb58c6eb1b7422ee4eeb482ac1b9ce089" exitCode=0 Apr 22 18:46:53.938451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938431 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="801765d47b09500281cd3151273c02791114d9e2c360dca7273971bea2445e95" exitCode=0 Apr 22 18:46:53.938451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938439 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="f8029498eeeccc900533e4cdb42d03f2e02eca97f3b1f67de9bfe273d75babb1" exitCode=0 Apr 22 18:46:53.938451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938447 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="9f2250fcac015b06488567b895d888b1ca864519b07171364b2de0dac663ca8b" exitCode=0 Apr 22 18:46:53.938451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938455 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="53140b11d4360313c934cd744bef70562df0467a412bffadb325f02e886e1522" exitCode=0 Apr 22 18:46:53.938451 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938462 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerID="6724d5ab170fa7166269a0d130940401cf4c9ab9227b19848efd81215825d2e0" exitCode=0 Apr 22 18:46:53.938915 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938465 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"1adbbd8628957d6b749eb5569f6be3fcb58c6eb1b7422ee4eeb482ac1b9ce089"} Apr 22 18:46:53.938915 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:46:53.938498 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"801765d47b09500281cd3151273c02791114d9e2c360dca7273971bea2445e95"} Apr 22 18:46:53.938915 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"f8029498eeeccc900533e4cdb42d03f2e02eca97f3b1f67de9bfe273d75babb1"} Apr 22 18:46:53.938915 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938521 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"9f2250fcac015b06488567b895d888b1ca864519b07171364b2de0dac663ca8b"} Apr 22 18:46:53.938915 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"53140b11d4360313c934cd744bef70562df0467a412bffadb325f02e886e1522"} Apr 22 18:46:53.938915 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.938549 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"6724d5ab170fa7166269a0d130940401cf4c9ab9227b19848efd81215825d2e0"} Apr 22 18:46:53.949288 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:53.949240 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:54.047693 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047660 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-grpc-tls\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.047895 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config-out\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.047895 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047727 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-trusted-ca-bundle\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.047895 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047762 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-kube-rbac-proxy\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.047895 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047883 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048122 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047924 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-kubelet-serving-ca-bundle\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048122 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.047950 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-web-config\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048220 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048161 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:54.048268 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048217 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-serving-certs-ca-bundle\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048268 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048254 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-thanos-prometheus-http-client-file\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048256 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:54.048363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048282 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-tls\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048312 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048345 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5b6n\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-kube-api-access-v5b6n\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048377 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-db\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048417 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-rulefiles-0\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048447 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048477 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-tls-assets\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048501 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:54.048553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048506 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-metrics-client-ca\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048876 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048556 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-metrics-client-certs\") pod \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\" (UID: \"1a3dc2d5-c865-43ec-94b2-fafce091e0da\") " Apr 22 18:46:54.048876 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048848 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:54.048876 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048853 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.049035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048890 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.049035 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.048906 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.050852 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.050561 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.050852 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.050628 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.050852 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.050713 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:54.050852 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.050814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config-out" (OuterVolumeSpecName: "config-out") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:46:54.051116 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.050883 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:46:54.052609 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.052560 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.052729 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.052606 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.052890 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.052851 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.053027 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.052921 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config" (OuterVolumeSpecName: "config") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.053396 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.053337 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-kube-api-access-v5b6n" (OuterVolumeSpecName: "kube-api-access-v5b6n") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "kube-api-access-v5b6n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:54.053500 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.053477 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.053666 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.053641 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:54.054223 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.054194 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.062589 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.062568 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-web-config" (OuterVolumeSpecName: "web-config") pod "1a3dc2d5-c865-43ec-94b2-fafce091e0da" (UID: "1a3dc2d5-c865-43ec-94b2-fafce091e0da"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:54.149216 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149177 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-tls-assets\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149216 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149214 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-configmap-metrics-client-ca\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149216 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149227 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-metrics-client-certs\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149236 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-grpc-tls\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149245 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config-out\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149255 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-kube-rbac-proxy\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149264 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-config\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149272 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-web-config\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149281 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149289 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149300 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-kube-rbac-proxy-web\") on node 
\"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149309 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5b6n\" (UniqueName: \"kubernetes.io/projected/1a3dc2d5-c865-43ec-94b2-fafce091e0da-kube-api-access-v5b6n\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149319 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-db\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149329 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1a3dc2d5-c865-43ec-94b2-fafce091e0da-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.149408 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.149338 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1a3dc2d5-c865-43ec-94b2-fafce091e0da-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:46:54.944133 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.944090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1a3dc2d5-c865-43ec-94b2-fafce091e0da","Type":"ContainerDied","Data":"36fe8a360c1f140235193a9e8abf1aafde05396c68cff214503347126564508b"} Apr 22 18:46:54.944512 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.944149 2575 scope.go:117] "RemoveContainer" containerID="1adbbd8628957d6b749eb5569f6be3fcb58c6eb1b7422ee4eeb482ac1b9ce089" Apr 22 18:46:54.944512 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.944223 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:54.951883 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.951854 2575 scope.go:117] "RemoveContainer" containerID="801765d47b09500281cd3151273c02791114d9e2c360dca7273971bea2445e95" Apr 22 18:46:54.958311 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.958293 2575 scope.go:117] "RemoveContainer" containerID="f8029498eeeccc900533e4cdb42d03f2e02eca97f3b1f67de9bfe273d75babb1" Apr 22 18:46:54.964263 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.964247 2575 scope.go:117] "RemoveContainer" containerID="9f2250fcac015b06488567b895d888b1ca864519b07171364b2de0dac663ca8b" Apr 22 18:46:54.968241 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.968218 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:54.970910 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.970893 2575 scope.go:117] "RemoveContainer" containerID="53140b11d4360313c934cd744bef70562df0467a412bffadb325f02e886e1522" Apr 22 18:46:54.974578 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.974557 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:54.977110 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.977092 2575 scope.go:117] "RemoveContainer" containerID="6724d5ab170fa7166269a0d130940401cf4c9ab9227b19848efd81215825d2e0" Apr 22 18:46:54.983676 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:54.983661 2575 scope.go:117] "RemoveContainer" containerID="796fdcd3d2466a9cb8d98323066866c2486e0b24449a0d11d4c65fe3dfa3e6d8" Apr 22 18:46:55.002614 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002588 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:55.002876 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002864 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="thanos-sidecar" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002879 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="thanos-sidecar" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002890 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="init-config-reloader" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002895 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="init-config-reloader" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002901 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="prom-label-proxy" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002907 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="prom-label-proxy" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002914 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-thanos" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002919 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-thanos" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002926 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="config-reloader" Apr 22 18:46:55.002928 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002931 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002937 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-web" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002943 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-web" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002949 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002954 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002961 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="alertmanager" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002966 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="alertmanager" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002973 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="prometheus" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002978 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="prometheus" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002987 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-metric" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.002993 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-metric" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003001 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-web" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003006 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-web" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003016 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003021 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003026 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003030 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003036 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="init-config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003041 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="init-config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003077 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-web" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003085 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy-metric" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003091 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003096 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="prometheus" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003102 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="alertmanager" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003108 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="thanos-sidecar" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003113 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003119 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-web" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003125 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="kube-rbac-proxy-thanos" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003130 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="prom-label-proxy" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003136 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" containerName="config-reloader" Apr 22 18:46:55.003198 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.003142 
2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19695eb5-0486-498b-993c-b32dc7a7f2ff" containerName="kube-rbac-proxy" Apr 22 18:46:55.008246 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.008229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.010583 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.010566 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2d81em3pps9af\"" Apr 22 18:46:55.010960 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.010946 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:46:55.011147 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011134 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:46:55.011230 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011157 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:46:55.011299 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011255 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:46:55.011374 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011359 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:46:55.011564 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011550 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:46:55.011644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011624 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:46:55.011709 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:46:55.011709 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:46:55.011709 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.011690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:46:55.012396 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.012377 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tzpp8\"" Apr 22 18:46:55.012717 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.012701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:46:55.014767 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.014751 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:46:55.017679 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.017663 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:46:55.025401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.025382 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:55.056328 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056448 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf1faafb-76d6-42b7-bb43-29c6de68a436-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056448 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056448 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056552 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kt7n\" (UniqueName: \"kubernetes.io/projected/cf1faafb-76d6-42b7-bb43-29c6de68a436-kube-api-access-6kt7n\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056552 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056552 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-web-config\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056552 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-k8s-db\") 
pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf1faafb-76d6-42b7-bb43-29c6de68a436-config-out\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056756 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056756 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056695 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056756 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-config\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056875 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056875 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056875 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.056875 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.056843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158020 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.157989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158020 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158193 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158193 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158285 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf1faafb-76d6-42b7-bb43-29c6de68a436-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158285 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158285 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158428 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kt7n\" (UniqueName: \"kubernetes.io/projected/cf1faafb-76d6-42b7-bb43-29c6de68a436-kube-api-access-6kt7n\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158428 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158428 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-web-config\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158428 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158517 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf1faafb-76d6-42b7-bb43-29c6de68a436-config-out\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-config\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.158644 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.158625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.161814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.160849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf1faafb-76d6-42b7-bb43-29c6de68a436-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.161814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161283 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.161814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.161814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.161814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.161814 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.162172 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.162172 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.161996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.162172 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.162033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.162319 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.162261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.162742 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.162716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.162971 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.162948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf1faafb-76d6-42b7-bb43-29c6de68a436-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.163320 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.163299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.164088 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.164068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.164515 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:46:55.164493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf1faafb-76d6-42b7-bb43-29c6de68a436-config-out\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.164635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.164621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-web-config\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.164673 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.164648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf1faafb-76d6-42b7-bb43-29c6de68a436-config\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.170912 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.170893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kt7n\" (UniqueName: \"kubernetes.io/projected/cf1faafb-76d6-42b7-bb43-29c6de68a436-kube-api-access-6kt7n\") pod \"prometheus-k8s-0\" (UID: \"cf1faafb-76d6-42b7-bb43-29c6de68a436\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.319346 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.319245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:55.473580 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.473551 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:55.474304 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:46:55.474281 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1faafb_76d6_42b7_bb43_29c6de68a436.slice/crio-d28dbdb7bbc2809f759313165060e0241ee5fe9c318a044d7af49bc02b433f8f WatchSource:0}: Error finding container d28dbdb7bbc2809f759313165060e0241ee5fe9c318a044d7af49bc02b433f8f: Status 404 returned error can't find the container with id d28dbdb7bbc2809f759313165060e0241ee5fe9c318a044d7af49bc02b433f8f Apr 22 18:46:55.948388 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.948347 2575 generic.go:358] "Generic (PLEG): container finished" podID="cf1faafb-76d6-42b7-bb43-29c6de68a436" containerID="1eb567ed61a6deac47de597778f851c078a4a09f904edebade19ce738b9ba502" exitCode=0 Apr 22 18:46:55.948834 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.948410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerDied","Data":"1eb567ed61a6deac47de597778f851c078a4a09f904edebade19ce738b9ba502"} Apr 22 18:46:55.948834 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:55.948450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"d28dbdb7bbc2809f759313165060e0241ee5fe9c318a044d7af49bc02b433f8f"} Apr 22 18:46:56.066113 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.066084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:46:56.068442 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.068417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19ace946-23b0-451c-93fa-078938130dd5-metrics-certs\") pod \"network-metrics-daemon-7zmbr\" (UID: \"19ace946-23b0-451c-93fa-078938130dd5\") " pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:46:56.094330 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.094311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7gnxr\"" Apr 22 18:46:56.101829 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.101803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7zmbr" Apr 22 18:46:56.201152 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.200999 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3dc2d5-c865-43ec-94b2-fafce091e0da" path="/var/lib/kubelet/pods/1a3dc2d5-c865-43ec-94b2-fafce091e0da/volumes" Apr 22 18:46:56.265837 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.265805 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7zmbr"] Apr 22 18:46:56.273759 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:46:56.273730 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19ace946_23b0_451c_93fa_078938130dd5.slice/crio-048a447f62810d6cdfed38666c7c4079f7cc412137eaeda74aebee0b787c7b37 WatchSource:0}: Error finding container 048a447f62810d6cdfed38666c7c4079f7cc412137eaeda74aebee0b787c7b37: Status 404 returned error can't find the container with id 048a447f62810d6cdfed38666c7c4079f7cc412137eaeda74aebee0b787c7b37 Apr 22 18:46:56.953679 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.953632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7zmbr" event={"ID":"19ace946-23b0-451c-93fa-078938130dd5","Type":"ContainerStarted","Data":"048a447f62810d6cdfed38666c7c4079f7cc412137eaeda74aebee0b787c7b37"} Apr 22 18:46:56.956739 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.956712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"81d159cb0ee33b6bbf4914ba3c305c3eeedc56719416b2bb3b68c6396c907ecc"} Apr 22 18:46:56.956883 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.956746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"55c54b083c3464f2fff65feb8b0cd8089c1ebac5a255d8c0a83db908c14376e3"} Apr 22 18:46:56.956883 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.956759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"647906bcbf1951c65b3e3c1bd9f9117dbcf1d29e6df4c7f4a3a48732bfff45be"} Apr 22 18:46:56.956883 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.956791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"8a78ee0a1aafb2cf97b8df245163afebd69c553ca5a37cc2f71ce3474c87251b"} Apr 22 18:46:56.956883 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.956805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"38ae190a92eaedf3ade756562bb459f72dd784d9004f22ed992e7a668fa42f4a"} Apr 22 18:46:56.956883 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.956817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cf1faafb-76d6-42b7-bb43-29c6de68a436","Type":"ContainerStarted","Data":"74a9fac7ba2e83eb9ca2d09504a5c99ff66fb14fa62baa5316720c1780269647"} Apr 22 18:46:56.993055 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:56.992967 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.99295134 podStartE2EDuration="2.99295134s" podCreationTimestamp="2026-04-22 18:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:56.992046637 +0000 UTC m=+253.231678997" watchObservedRunningTime="2026-04-22 18:46:56.99295134 +0000 UTC m=+253.232583728" Apr 22 18:46:57.961824 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:57.961768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7zmbr" event={"ID":"19ace946-23b0-451c-93fa-078938130dd5","Type":"ContainerStarted","Data":"d58f6996f1fc64ab9b0d337e2e7d0cbc3c429a62c11a6be5b2d9ae661dfe1ccc"} Apr 22 18:46:57.961824 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:57.961822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7zmbr" event={"ID":"19ace946-23b0-451c-93fa-078938130dd5","Type":"ContainerStarted","Data":"15a9a6825dce7c680a16391492e77f985034b1f64ada4a5cb37a2044bcb879d6"} Apr 22 18:46:57.981308 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:46:57.981252 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7zmbr" podStartSLOduration=252.981634974 podStartE2EDuration="4m13.981236147s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:46:56.275664202 +0000 UTC m=+252.515296539" lastFinishedPulling="2026-04-22 18:46:57.275265374 +0000 UTC m=+253.514897712" observedRunningTime="2026-04-22 18:46:57.979407467 +0000 UTC m=+254.219039826" watchObservedRunningTime="2026-04-22 18:46:57.981236147 +0000 UTC m=+254.220868505" Apr 22 18:47:00.319561 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:00.319531 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:44.098584 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:44.098548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:47:44.098584 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:44.098554 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:47:44.105701 ip-10-0-130-32 
kubenswrapper[2575]: I0422 18:47:44.105685 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:47:44.105781 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:44.105696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:47:44.108904 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:44.108889 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:47:55.320252 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:55.320209 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:55.335341 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:55.335314 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:56.139294 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:47:56.139267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:05.291517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.291483 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-58wc6"] Apr 22 18:50:05.294989 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.294965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.297406 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.297388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:50:05.315542 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.315520 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-58wc6"] Apr 22 18:50:05.376967 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.376933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f50f3098-de64-4067-a39e-e51151f6082a-kubelet-config\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.376967 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.376973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f50f3098-de64-4067-a39e-e51151f6082a-dbus\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.377156 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.376996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f50f3098-de64-4067-a39e-e51151f6082a-original-pull-secret\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.478066 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.478028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f50f3098-de64-4067-a39e-e51151f6082a-kubelet-config\") pod 
\"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.478066 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.478065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f50f3098-de64-4067-a39e-e51151f6082a-dbus\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.478259 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.478086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f50f3098-de64-4067-a39e-e51151f6082a-original-pull-secret\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.478259 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.478160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f50f3098-de64-4067-a39e-e51151f6082a-kubelet-config\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.478259 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.478215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f50f3098-de64-4067-a39e-e51151f6082a-dbus\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.480320 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.480299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f50f3098-de64-4067-a39e-e51151f6082a-original-pull-secret\") pod \"global-pull-secret-syncer-58wc6\" (UID: \"f50f3098-de64-4067-a39e-e51151f6082a\") " pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.604117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.604014 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-58wc6" Apr 22 18:50:05.738669 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.738646 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-58wc6"] Apr 22 18:50:05.740869 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:50:05.740836 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50f3098_de64_4067_a39e_e51151f6082a.slice/crio-1fedf1fcee4a22c461a566da2adc5f7af84b81c8da7fb5088dd1469e612f97e8 WatchSource:0}: Error finding container 1fedf1fcee4a22c461a566da2adc5f7af84b81c8da7fb5088dd1469e612f97e8: Status 404 returned error can't find the container with id 1fedf1fcee4a22c461a566da2adc5f7af84b81c8da7fb5088dd1469e612f97e8 Apr 22 18:50:05.742660 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:05.742644 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:50:06.479916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:06.479880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-58wc6" event={"ID":"f50f3098-de64-4067-a39e-e51151f6082a","Type":"ContainerStarted","Data":"1fedf1fcee4a22c461a566da2adc5f7af84b81c8da7fb5088dd1469e612f97e8"} Apr 22 18:50:10.492440 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:10.492406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-58wc6" event={"ID":"f50f3098-de64-4067-a39e-e51151f6082a","Type":"ContainerStarted","Data":"bfb67b2c18d4d3eaa954170e708821ec781864bda5b7a7812c98b17087d27c5c"} Apr 22 18:50:10.509831 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:50:10.509759 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-58wc6" podStartSLOduration=1.807301329 podStartE2EDuration="5.50974643s" podCreationTimestamp="2026-04-22 18:50:05 +0000 UTC" firstStartedPulling="2026-04-22 18:50:05.742792894 +0000 UTC m=+441.982425231" lastFinishedPulling="2026-04-22 18:50:09.445237993 +0000 UTC m=+445.684870332" observedRunningTime="2026-04-22 18:50:10.508109296 +0000 UTC m=+446.747741656" watchObservedRunningTime="2026-04-22 18:50:10.50974643 +0000 UTC m=+446.749378788" Apr 22 18:51:24.534920 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.534840 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-sgxw2"] Apr 22 18:51:24.536897 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.536863 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.539633 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.539608 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-zjr6k\"" Apr 22 18:51:24.540259 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.540240 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 18:51:24.540760 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.540746 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 18:51:24.548692 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.548672 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-sgxw2"] Apr 22 18:51:24.691038 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.691004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d18be7a-4e59-4dc0-a7da-256ae7a9d154-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-sgxw2\" (UID: \"4d18be7a-4e59-4dc0-a7da-256ae7a9d154\") " pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.691199 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.691069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txsc6\" (UniqueName: \"kubernetes.io/projected/4d18be7a-4e59-4dc0-a7da-256ae7a9d154-kube-api-access-txsc6\") pod \"cert-manager-cainjector-68b757865b-sgxw2\" (UID: \"4d18be7a-4e59-4dc0-a7da-256ae7a9d154\") " pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.791627 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.791541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d18be7a-4e59-4dc0-a7da-256ae7a9d154-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-sgxw2\" (UID: \"4d18be7a-4e59-4dc0-a7da-256ae7a9d154\") " pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.791627 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.791607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txsc6\" (UniqueName: \"kubernetes.io/projected/4d18be7a-4e59-4dc0-a7da-256ae7a9d154-kube-api-access-txsc6\") pod \"cert-manager-cainjector-68b757865b-sgxw2\" (UID: \"4d18be7a-4e59-4dc0-a7da-256ae7a9d154\") " pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.801811 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.801766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d18be7a-4e59-4dc0-a7da-256ae7a9d154-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-sgxw2\" (UID: \"4d18be7a-4e59-4dc0-a7da-256ae7a9d154\") " pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.802022 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.802003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txsc6\" (UniqueName: \"kubernetes.io/projected/4d18be7a-4e59-4dc0-a7da-256ae7a9d154-kube-api-access-txsc6\") pod \"cert-manager-cainjector-68b757865b-sgxw2\" (UID: \"4d18be7a-4e59-4dc0-a7da-256ae7a9d154\") " 
pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.863350 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.863313 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" Apr 22 18:51:24.987881 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:24.987838 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-sgxw2"] Apr 22 18:51:24.990104 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:51:24.990067 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d18be7a_4e59_4dc0_a7da_256ae7a9d154.slice/crio-a582c534753c22ea433c77ba394e0e848d40e70a104208484b8d1939c272a69a WatchSource:0}: Error finding container a582c534753c22ea433c77ba394e0e848d40e70a104208484b8d1939c272a69a: Status 404 returned error can't find the container with id a582c534753c22ea433c77ba394e0e848d40e70a104208484b8d1939c272a69a Apr 22 18:51:25.701845 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:25.701807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" event={"ID":"4d18be7a-4e59-4dc0-a7da-256ae7a9d154","Type":"ContainerStarted","Data":"a582c534753c22ea433c77ba394e0e848d40e70a104208484b8d1939c272a69a"} Apr 22 18:51:28.712848 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:28.712813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" event={"ID":"4d18be7a-4e59-4dc0-a7da-256ae7a9d154","Type":"ContainerStarted","Data":"50ad9ae4ebd3988943c3299f6319afb94cbfe11721c71911e016ca1c3507a5b9"} Apr 22 18:51:28.730602 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:28.730554 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-sgxw2" podStartSLOduration=1.975226869 podStartE2EDuration="4.730539431s" podCreationTimestamp="2026-04-22 18:51:24 +0000 UTC" firstStartedPulling="2026-04-22 18:51:24.992015478 +0000 UTC m=+521.231647816" lastFinishedPulling="2026-04-22 18:51:27.747328041 +0000 UTC m=+523.986960378" observedRunningTime="2026-04-22 18:51:28.729262123 +0000 UTC m=+524.968894482" watchObservedRunningTime="2026-04-22 18:51:28.730539431 +0000 UTC m=+524.970171791" Apr 22 18:51:50.694281 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.694244 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59"] Apr 22 18:51:50.698157 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.698138 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.705837 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.705627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 18:51:50.705952 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.705843 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 18:51:50.706005 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.705975 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-srzjc\"" Apr 22 18:51:50.706061 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.706033 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 18:51:50.706621 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.706606 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 18:51:50.717056 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.717034 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59"] Apr 22 18:51:50.793055 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.793022 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32b7588b-a569-4c47-95a7-5ab772e8e085-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.793055 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.793057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32b7588b-a569-4c47-95a7-5ab772e8e085-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.793289 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.793092 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kxvk\" (UniqueName: \"kubernetes.io/projected/32b7588b-a569-4c47-95a7-5ab772e8e085-kube-api-access-2kxvk\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.894323 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.894291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32b7588b-a569-4c47-95a7-5ab772e8e085-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.894323 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.894325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/32b7588b-a569-4c47-95a7-5ab772e8e085-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.894582 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.894360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kxvk\" (UniqueName: \"kubernetes.io/projected/32b7588b-a569-4c47-95a7-5ab772e8e085-kube-api-access-2kxvk\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.896826 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.896803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32b7588b-a569-4c47-95a7-5ab772e8e085-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.896931 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.896807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32b7588b-a569-4c47-95a7-5ab772e8e085-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:50.904376 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:50.904356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kxvk\" (UniqueName: \"kubernetes.io/projected/32b7588b-a569-4c47-95a7-5ab772e8e085-kube-api-access-2kxvk\") pod \"opendatahub-operator-controller-manager-dd89cc56c-ddt59\" (UID: \"32b7588b-a569-4c47-95a7-5ab772e8e085\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:51.008737 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:51.008700 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:51.140448 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:51.140412 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59"] Apr 22 18:51:51.143983 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:51:51.143942 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b7588b_a569_4c47_95a7_5ab772e8e085.slice/crio-962edb2de06058b1bce36c02d4589d30f2154e116754bf1a664e35861b331ce7 WatchSource:0}: Error finding container 962edb2de06058b1bce36c02d4589d30f2154e116754bf1a664e35861b331ce7: Status 404 returned error can't find the container with id 962edb2de06058b1bce36c02d4589d30f2154e116754bf1a664e35861b331ce7 Apr 22 18:51:51.787566 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:51.787512 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" event={"ID":"32b7588b-a569-4c47-95a7-5ab772e8e085","Type":"ContainerStarted","Data":"962edb2de06058b1bce36c02d4589d30f2154e116754bf1a664e35861b331ce7"} Apr 22 18:51:53.796024 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:53.795930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" event={"ID":"32b7588b-a569-4c47-95a7-5ab772e8e085","Type":"ContainerStarted","Data":"2f72a529d1f9bf7d9395d8d2f68d0ec2dfb18daf27256102410de478e049d40d"} Apr 22 18:51:53.796357 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:53.796058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:51:53.843467 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:51:53.843416 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" podStartSLOduration=1.482604596 podStartE2EDuration="3.843401275s" podCreationTimestamp="2026-04-22 18:51:50 +0000 UTC" firstStartedPulling="2026-04-22 18:51:51.145536248 +0000 UTC m=+547.385168585" lastFinishedPulling="2026-04-22 18:51:53.506332912 +0000 UTC m=+549.745965264" observedRunningTime="2026-04-22 18:51:53.841531249 +0000 UTC m=+550.081163608" watchObservedRunningTime="2026-04-22 18:51:53.843401275 +0000 UTC m=+550.083033635" Apr 22 18:52:04.801442 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:04.801409 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-ddt59" Apr 22 18:52:07.991329 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:07.991289 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm"] Apr 22 18:52:07.994637 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:07.994616 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:07.999550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:07.999530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:52:08.000106 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.000086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kp9vn\"" Apr 22 18:52:08.001252 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.001231 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:52:08.001363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.001234 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:52:08.001363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.001269 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:52:08.001363 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.001321 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:52:08.018452 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.018430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm"] Apr 22 18:52:08.033419 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.033398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95296fed-c059-456c-ac9e-67cb641f1a2f-manager-config\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.033522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.033426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95296fed-c059-456c-ac9e-67cb641f1a2f-metrics-cert\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.033522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.033480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl9ms\" (UniqueName: \"kubernetes.io/projected/95296fed-c059-456c-ac9e-67cb641f1a2f-kube-api-access-sl9ms\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.033522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.033499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95296fed-c059-456c-ac9e-67cb641f1a2f-cert\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.134110 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.134074 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sl9ms\" (UniqueName: \"kubernetes.io/projected/95296fed-c059-456c-ac9e-67cb641f1a2f-kube-api-access-sl9ms\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.135184 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.134113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95296fed-c059-456c-ac9e-67cb641f1a2f-cert\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.135184 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.134669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95296fed-c059-456c-ac9e-67cb641f1a2f-manager-config\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.135184 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.134721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95296fed-c059-456c-ac9e-67cb641f1a2f-metrics-cert\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.135633 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.135609 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95296fed-c059-456c-ac9e-67cb641f1a2f-manager-config\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.138850 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.138252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95296fed-c059-456c-ac9e-67cb641f1a2f-metrics-cert\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.142992 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.142971 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95296fed-c059-456c-ac9e-67cb641f1a2f-cert\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.169682 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.169660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl9ms\" (UniqueName: \"kubernetes.io/projected/95296fed-c059-456c-ac9e-67cb641f1a2f-kube-api-access-sl9ms\") pod \"lws-controller-manager-54dc496758-2fhhm\" (UID: \"95296fed-c059-456c-ac9e-67cb641f1a2f\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.303636 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.303550 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:08.451808 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.451763 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm"] Apr 22 18:52:08.453678 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:52:08.453653 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95296fed_c059_456c_ac9e_67cb641f1a2f.slice/crio-98dc474c717966c8a880baf704674e2f3a2704da8c11290e24ed622bb50bd0dd WatchSource:0}: Error finding container 98dc474c717966c8a880baf704674e2f3a2704da8c11290e24ed622bb50bd0dd: Status 404 returned error can't find the container with id 98dc474c717966c8a880baf704674e2f3a2704da8c11290e24ed622bb50bd0dd Apr 22 18:52:08.848001 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:08.847963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" event={"ID":"95296fed-c059-456c-ac9e-67cb641f1a2f","Type":"ContainerStarted","Data":"98dc474c717966c8a880baf704674e2f3a2704da8c11290e24ed622bb50bd0dd"} Apr 22 18:52:09.102207 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.101987 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g"] Apr 22 18:52:09.105433 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.105405 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.108272 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.108249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 18:52:09.108440 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.108411 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 18:52:09.109542 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.109518 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-dthq8\"" Apr 22 18:52:09.136789 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.136752 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g"] Apr 22 18:52:09.244742 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.244710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xccwc\" (UniqueName: \"kubernetes.io/projected/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-kube-api-access-xccwc\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.244941 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.244752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-tls-certs\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.244941 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.244821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-tmp\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.346117 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.346079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xccwc\" (UniqueName: \"kubernetes.io/projected/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-kube-api-access-xccwc\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.346310 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.346151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-tls-certs\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.346310 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.346189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-tmp\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.348755 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.348733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-tmp\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.348970 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.348949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-tls-certs\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.356975 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.356899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xccwc\" (UniqueName: \"kubernetes.io/projected/51f9655e-566c-4ec9-847b-ea96b2a6b6c1-kube-api-access-xccwc\") pod \"kube-auth-proxy-7cff94f675-fwl9g\" (UID: \"51f9655e-566c-4ec9-847b-ea96b2a6b6c1\") " pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.417429 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.417396 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" Apr 22 18:52:09.575209 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.575185 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g"] Apr 22 18:52:09.578339 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:52:09.578306 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f9655e_566c_4ec9_847b_ea96b2a6b6c1.slice/crio-2781698c6ae6b248edd32022a6e390ae7fda1245e76df0ca5ec8f96b3e4b65f6 WatchSource:0}: Error finding container 2781698c6ae6b248edd32022a6e390ae7fda1245e76df0ca5ec8f96b3e4b65f6: Status 404 returned error can't find the container with id 2781698c6ae6b248edd32022a6e390ae7fda1245e76df0ca5ec8f96b3e4b65f6 Apr 22 18:52:09.852533 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:09.852501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" event={"ID":"51f9655e-566c-4ec9-847b-ea96b2a6b6c1","Type":"ContainerStarted","Data":"2781698c6ae6b248edd32022a6e390ae7fda1245e76df0ca5ec8f96b3e4b65f6"} Apr 22 18:52:13.866841 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:13.866800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" event={"ID":"51f9655e-566c-4ec9-847b-ea96b2a6b6c1","Type":"ContainerStarted","Data":"fb7686cb248121b2606f93fbab2227c517f9bd736b9ac6d384ab27fd3fb9796d"} Apr 22 18:52:13.868121 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:13.868098 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" event={"ID":"95296fed-c059-456c-ac9e-67cb641f1a2f","Type":"ContainerStarted","Data":"53ffedd50fb63af1261701795b679f69aba258a62381225cb73c7641d4399a10"} Apr 22 18:52:13.868235 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:13.868224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:13.884525 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:13.884483 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7cff94f675-fwl9g" podStartSLOduration=1.613296083 podStartE2EDuration="4.884470148s" podCreationTimestamp="2026-04-22 18:52:09 +0000 UTC" firstStartedPulling="2026-04-22 18:52:09.580657547 +0000 UTC m=+565.820289899" lastFinishedPulling="2026-04-22 18:52:12.851831614 +0000 UTC m=+569.091463964" observedRunningTime="2026-04-22 18:52:13.883878523 +0000 UTC m=+570.123510882" watchObservedRunningTime="2026-04-22 18:52:13.884470148 +0000 UTC m=+570.124102508" Apr 22 18:52:13.906530 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:13.906474 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" podStartSLOduration=2.555221478 podStartE2EDuration="6.906455952s" podCreationTimestamp="2026-04-22 18:52:07 +0000 UTC" firstStartedPulling="2026-04-22 18:52:08.455427281 +0000 UTC m=+564.695059618" lastFinishedPulling="2026-04-22 18:52:12.806661745 +0000 UTC m=+569.046294092" observedRunningTime="2026-04-22 18:52:13.90596594 +0000 UTC m=+570.145598313" watchObservedRunningTime="2026-04-22 18:52:13.906455952 +0000 UTC m=+570.146088317" Apr 22 18:52:24.873750 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:24.873718 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54dc496758-2fhhm" Apr 22 18:52:44.121373 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:44.121344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:52:44.123678 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:44.123656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:52:44.127247 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:44.127225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:52:44.129460 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:52:44.129444 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:54:00.389793 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.389743 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh"] Apr 22 18:54:00.393249 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.393231 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:00.397292 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.397270 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:54:00.397841 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.397821 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:54:00.397951 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.397826 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-brdcv\"" Apr 22 18:54:00.412092 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.412065 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh"] Apr 22 18:54:00.438904 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.438878 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh"] Apr 22 18:54:00.439115 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:54:00.439097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-qp9kt], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" podUID="73a3aaf0-9316-456c-ac04-3cb7103a8db1" Apr 22 18:54:00.447168 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.447138 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh"] Apr 22 18:54:00.450825 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.450797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/73a3aaf0-9316-456c-ac04-3cb7103a8db1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:00.450945 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.450852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp9kt\" (UniqueName: \"kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:00.460288 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.460267 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l"] Apr 22 18:54:00.463550 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.463533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.476945 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.476924 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l"] Apr 22 18:54:00.551803 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.551751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16a312c7-51c8-4f45-94f1-7dea30519951-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7wl6l\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.552006 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.551844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9kt\" (UniqueName: \"kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:00.552006 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.551952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/73a3aaf0-9316-456c-ac04-3cb7103a8db1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:00.552006 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.551979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbgxn\" (UniqueName: \"kubernetes.io/projected/16a312c7-51c8-4f45-94f1-7dea30519951-kube-api-access-rbgxn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7wl6l\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.552424 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.552397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/73a3aaf0-9316-456c-ac04-3cb7103a8db1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:00.556575 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:54:00.556553 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qp9kt for pod kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh: failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-r7cmh" not found Apr 22 18:54:00.556669 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:54:00.556633 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt podName:73a3aaf0-9316-456c-ac04-3cb7103a8db1 nodeName:}" failed. No retries permitted until 2026-04-22 18:54:01.056613757 +0000 UTC m=+677.296246097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qp9kt" (UniqueName: "kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt") pod "kuadrant-operator-controller-manager-55c7f4c975-r7cmh" (UID: "73a3aaf0-9316-456c-ac04-3cb7103a8db1") : failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-r7cmh" not found Apr 22 18:54:00.653347 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.653264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16a312c7-51c8-4f45-94f1-7dea30519951-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7wl6l\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.653479 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.653371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbgxn\" (UniqueName: \"kubernetes.io/projected/16a312c7-51c8-4f45-94f1-7dea30519951-kube-api-access-rbgxn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7wl6l\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.653624 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.653605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16a312c7-51c8-4f45-94f1-7dea30519951-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7wl6l\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.662745 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.662725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbgxn\" (UniqueName: \"kubernetes.io/projected/16a312c7-51c8-4f45-94f1-7dea30519951-kube-api-access-rbgxn\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7wl6l\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.773832 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.773766 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:00.897522 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:00.897497 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l"] Apr 22 18:54:00.899806 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:54:00.899758 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a312c7_51c8_4f45_94f1_7dea30519951.slice/crio-456f4ffc72d899efc9bdc850d4eff203ab1806c8754fcf09d6c8927239c13763 WatchSource:0}: Error finding container 456f4ffc72d899efc9bdc850d4eff203ab1806c8754fcf09d6c8927239c13763: Status 404 returned error can't find the container with id 456f4ffc72d899efc9bdc850d4eff203ab1806c8754fcf09d6c8927239c13763 Apr 22 18:54:01.056868 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.056822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp9kt\" (UniqueName: \"kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt\") pod \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:01.059230 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:54:01.059212 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qp9kt for pod kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh: failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-r7cmh" not found Apr 22 18:54:01.059290 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:54:01.059279 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt podName:73a3aaf0-9316-456c-ac04-3cb7103a8db1 nodeName:}" failed. No retries permitted until 2026-04-22 18:54:02.059264554 +0000 UTC m=+678.298896892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qp9kt" (UniqueName: "kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt") pod "kuadrant-operator-controller-manager-55c7f4c975-r7cmh" (UID: "73a3aaf0-9316-456c-ac04-3cb7103a8db1") : failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-r7cmh" not found Apr 22 18:54:01.215963 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.215928 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" event={"ID":"16a312c7-51c8-4f45-94f1-7dea30519951","Type":"ContainerStarted","Data":"456f4ffc72d899efc9bdc850d4eff203ab1806c8754fcf09d6c8927239c13763"} Apr 22 18:54:01.215963 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.215964 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:01.218528 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.218498 2575 status_manager.go:895] "Failed to get status for pod" podUID="73a3aaf0-9316-456c-ac04-3cb7103a8db1" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" is forbidden: User \"system:node:ip-10-0-130-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-32.ec2.internal' and this object" Apr 22 18:54:01.220800 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.220768 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:01.222683 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.222656 2575 status_manager.go:895] "Failed to get status for pod" podUID="73a3aaf0-9316-456c-ac04-3cb7103a8db1" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" is forbidden: User \"system:node:ip-10-0-130-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-32.ec2.internal' and this object" Apr 22 18:54:01.258135 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.258114 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/73a3aaf0-9316-456c-ac04-3cb7103a8db1-extensions-socket-volume\") pod \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\" (UID: \"73a3aaf0-9316-456c-ac04-3cb7103a8db1\") " Apr 22 18:54:01.258341 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.258327 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp9kt\" (UniqueName: \"kubernetes.io/projected/73a3aaf0-9316-456c-ac04-3cb7103a8db1-kube-api-access-qp9kt\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:54:01.258405 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.258344 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a3aaf0-9316-456c-ac04-3cb7103a8db1-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "73a3aaf0-9316-456c-ac04-3cb7103a8db1" (UID: "73a3aaf0-9316-456c-ac04-3cb7103a8db1"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:01.359642 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:01.359613 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/73a3aaf0-9316-456c-ac04-3cb7103a8db1-extensions-socket-volume\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:54:02.194325 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:02.194287 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a3aaf0-9316-456c-ac04-3cb7103a8db1" path="/var/lib/kubelet/pods/73a3aaf0-9316-456c-ac04-3cb7103a8db1/volumes" Apr 22 18:54:02.220787 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:02.220750 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" Apr 22 18:54:02.225127 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:02.225094 2575 status_manager.go:895] "Failed to get status for pod" podUID="73a3aaf0-9316-456c-ac04-3cb7103a8db1" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" is forbidden: User \"system:node:ip-10-0-130-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-32.ec2.internal' and this object" Apr 22 18:54:04.193717 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:04.193681 2575 status_manager.go:895] "Failed to get status for pod" podUID="73a3aaf0-9316-456c-ac04-3cb7103a8db1" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-r7cmh" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-r7cmh\" is forbidden: User \"system:node:ip-10-0-130-32.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-32.ec2.internal' and this object" Apr 22 18:54:06.235029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:06.234995 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" event={"ID":"16a312c7-51c8-4f45-94f1-7dea30519951","Type":"ContainerStarted","Data":"8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb"} Apr 22 18:54:06.235421 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:06.235102 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:06.268499 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:06.268449 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" podStartSLOduration=1.539348154 podStartE2EDuration="6.268434227s" podCreationTimestamp="2026-04-22 18:54:00 +0000 UTC" firstStartedPulling="2026-04-22 18:54:00.902051305 +0000 UTC m=+677.141683647" lastFinishedPulling="2026-04-22 18:54:05.631137383 +0000 UTC m=+681.870769720" observedRunningTime="2026-04-22 18:54:06.267087783 +0000 UTC m=+682.506720141" watchObservedRunningTime="2026-04-22 18:54:06.268434227 +0000 UTC m=+682.508066588" Apr 22 18:54:17.240750 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:17.240718 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 18:54:39.127495 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.127464 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jbnmj"] Apr 22 18:54:39.134895 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.134857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jbnmj"] Apr 22 18:54:39.135047 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.134954 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:39.137347 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.137324 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-jshl5\"" Apr 22 18:54:39.168803 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.168759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgj4f\" (UniqueName: \"kubernetes.io/projected/edcf10cf-7570-4981-b524-1b7f6e7bf901-kube-api-access-zgj4f\") pod \"authorino-f99f4b5cd-jbnmj\" (UID: \"edcf10cf-7570-4981-b524-1b7f6e7bf901\") " pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:39.269380 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.269350 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgj4f\" (UniqueName: \"kubernetes.io/projected/edcf10cf-7570-4981-b524-1b7f6e7bf901-kube-api-access-zgj4f\") pod \"authorino-f99f4b5cd-jbnmj\" (UID: \"edcf10cf-7570-4981-b524-1b7f6e7bf901\") " pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:39.285220 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.285194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgj4f\" (UniqueName: \"kubernetes.io/projected/edcf10cf-7570-4981-b524-1b7f6e7bf901-kube-api-access-zgj4f\") pod \"authorino-f99f4b5cd-jbnmj\" (UID: \"edcf10cf-7570-4981-b524-1b7f6e7bf901\") " pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:39.446433 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.446347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:39.560525 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:39.560505 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jbnmj"] Apr 22 18:54:39.563153 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:54:39.563125 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedcf10cf_7570_4981_b524_1b7f6e7bf901.slice/crio-31efdea09ef17b3e47789ec5dc20b11463521b488086980fa24d5476c0d325f4 WatchSource:0}: Error finding container 31efdea09ef17b3e47789ec5dc20b11463521b488086980fa24d5476c0d325f4: Status 404 returned error can't find the container with id 31efdea09ef17b3e47789ec5dc20b11463521b488086980fa24d5476c0d325f4 Apr 22 18:54:40.344481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:40.344417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" event={"ID":"edcf10cf-7570-4981-b524-1b7f6e7bf901","Type":"ContainerStarted","Data":"31efdea09ef17b3e47789ec5dc20b11463521b488086980fa24d5476c0d325f4"} Apr 22 18:54:43.359201 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:43.359162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" event={"ID":"edcf10cf-7570-4981-b524-1b7f6e7bf901","Type":"ContainerStarted","Data":"bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524"} Apr 22 18:54:43.373145 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:43.373102 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" podStartSLOduration=1.279848896 podStartE2EDuration="4.373089399s" podCreationTimestamp="2026-04-22 18:54:39 +0000 UTC" firstStartedPulling="2026-04-22 
18:54:39.564352596 +0000 UTC m=+715.803984932" lastFinishedPulling="2026-04-22 18:54:42.657593092 +0000 UTC m=+718.897225435" observedRunningTime="2026-04-22 18:54:43.373076539 +0000 UTC m=+719.612708898" watchObservedRunningTime="2026-04-22 18:54:43.373089399 +0000 UTC m=+719.612721757" Apr 22 18:54:45.908916 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:45.908886 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jbnmj"] Apr 22 18:54:45.909383 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:45.909105 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" podUID="edcf10cf-7570-4981-b524-1b7f6e7bf901" containerName="authorino" containerID="cri-o://bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524" gracePeriod=30 Apr 22 18:54:46.153596 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.153573 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:46.231674 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.231650 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgj4f\" (UniqueName: \"kubernetes.io/projected/edcf10cf-7570-4981-b524-1b7f6e7bf901-kube-api-access-zgj4f\") pod \"edcf10cf-7570-4981-b524-1b7f6e7bf901\" (UID: \"edcf10cf-7570-4981-b524-1b7f6e7bf901\") " Apr 22 18:54:46.233797 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.233742 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edcf10cf-7570-4981-b524-1b7f6e7bf901-kube-api-access-zgj4f" (OuterVolumeSpecName: "kube-api-access-zgj4f") pod "edcf10cf-7570-4981-b524-1b7f6e7bf901" (UID: "edcf10cf-7570-4981-b524-1b7f6e7bf901"). InnerVolumeSpecName "kube-api-access-zgj4f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:46.332300 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.332272 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgj4f\" (UniqueName: \"kubernetes.io/projected/edcf10cf-7570-4981-b524-1b7f6e7bf901-kube-api-access-zgj4f\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:54:46.369888 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.369854 2575 generic.go:358] "Generic (PLEG): container finished" podID="edcf10cf-7570-4981-b524-1b7f6e7bf901" containerID="bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524" exitCode=0 Apr 22 18:54:46.370016 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.369901 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" Apr 22 18:54:46.370016 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.369944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" event={"ID":"edcf10cf-7570-4981-b524-1b7f6e7bf901","Type":"ContainerDied","Data":"bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524"} Apr 22 18:54:46.370016 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.369982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jbnmj" event={"ID":"edcf10cf-7570-4981-b524-1b7f6e7bf901","Type":"ContainerDied","Data":"31efdea09ef17b3e47789ec5dc20b11463521b488086980fa24d5476c0d325f4"} Apr 22 18:54:46.370016 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.369996 2575 scope.go:117] "RemoveContainer" containerID="bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524" Apr 22 18:54:46.378220 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.378205 2575 scope.go:117] "RemoveContainer" containerID="bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524" Apr 22 18:54:46.378469 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:54:46.378446 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524\": container with ID starting with bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524 not found: ID does not exist" containerID="bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524" Apr 22 18:54:46.378538 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.378479 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524"} err="failed to get container status \"bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524\": rpc error: code = NotFound desc = could not find container \"bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524\": container with ID starting with bb72b10de3cb34b1e64ba3297d327e76246071c699ea3a028123af28035ec524 not found: ID does not exist" Apr 22 18:54:46.394024 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.394001 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jbnmj"] Apr 22 18:54:46.398549 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:46.398526 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jbnmj"] Apr 22 18:54:48.194464 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:54:48.194430 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edcf10cf-7570-4981-b524-1b7f6e7bf901" path="/var/lib/kubelet/pods/edcf10cf-7570-4981-b524-1b7f6e7bf901/volumes" Apr 22 18:55:12.521171 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.521137 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-6mfmx"] Apr 22 18:55:12.521651 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.521438 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edcf10cf-7570-4981-b524-1b7f6e7bf901" containerName="authorino" Apr 22 18:55:12.521651 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.521449 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcf10cf-7570-4981-b524-1b7f6e7bf901" containerName="authorino" Apr 22 18:55:12.521651 ip-10-0-130-32 kubenswrapper[2575]: I0422 
18:55:12.521536 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="edcf10cf-7570-4981-b524-1b7f6e7bf901" containerName="authorino" Apr 22 18:55:12.524592 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.524571 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:12.526849 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.526828 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-jshl5\"" Apr 22 18:55:12.530725 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.530698 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-6mfmx"] Apr 22 18:55:12.656321 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.656292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crl9b\" (UniqueName: \"kubernetes.io/projected/af2c914c-2d3c-42a0-8ecd-264a3a05df71-kube-api-access-crl9b\") pod \"authorino-8b475cf9f-6mfmx\" (UID: \"af2c914c-2d3c-42a0-8ecd-264a3a05df71\") " pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:12.744896 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.744864 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-6mfmx"] Apr 22 18:55:12.745087 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:55:12.745069 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-crl9b], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-6mfmx" podUID="af2c914c-2d3c-42a0-8ecd-264a3a05df71" Apr 22 18:55:12.757481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.757452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crl9b\" (UniqueName: \"kubernetes.io/projected/af2c914c-2d3c-42a0-8ecd-264a3a05df71-kube-api-access-crl9b\") pod \"authorino-8b475cf9f-6mfmx\" (UID: \"af2c914c-2d3c-42a0-8ecd-264a3a05df71\") " pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:12.766128 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.766102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crl9b\" (UniqueName: \"kubernetes.io/projected/af2c914c-2d3c-42a0-8ecd-264a3a05df71-kube-api-access-crl9b\") pod \"authorino-8b475cf9f-6mfmx\" (UID: \"af2c914c-2d3c-42a0-8ecd-264a3a05df71\") " pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:12.770953 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.770934 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bf4f4944-6gqxj"] Apr 22 18:55:12.774325 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.774279 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:12.784562 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.784537 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bf4f4944-6gqxj"] Apr 22 18:55:12.858286 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.858255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlw6t\" (UniqueName: \"kubernetes.io/projected/4b7d8fb8-69de-479d-8005-5d4215584a17-kube-api-access-dlw6t\") pod \"authorino-68bf4f4944-6gqxj\" (UID: \"4b7d8fb8-69de-479d-8005-5d4215584a17\") " pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:12.958975 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.958942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlw6t\" (UniqueName: \"kubernetes.io/projected/4b7d8fb8-69de-479d-8005-5d4215584a17-kube-api-access-dlw6t\") pod \"authorino-68bf4f4944-6gqxj\" (UID: \"4b7d8fb8-69de-479d-8005-5d4215584a17\") " pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:12.967448 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.967421 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlw6t\" (UniqueName: \"kubernetes.io/projected/4b7d8fb8-69de-479d-8005-5d4215584a17-kube-api-access-dlw6t\") pod \"authorino-68bf4f4944-6gqxj\" (UID: \"4b7d8fb8-69de-479d-8005-5d4215584a17\") " pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:12.999826 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:12.999800 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68bf4f4944-6gqxj"] Apr 22 18:55:13.000015 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.000003 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:13.117404 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.117376 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68bf4f4944-6gqxj"] Apr 22 18:55:13.119683 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:55:13.119656 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b7d8fb8_69de_479d_8005_5d4215584a17.slice/crio-b37d9ed9659f032b4f7ea4bcdd938f8e70f94f9e83d64620c35f8442a36e46bb WatchSource:0}: Error finding container b37d9ed9659f032b4f7ea4bcdd938f8e70f94f9e83d64620c35f8442a36e46bb: Status 404 returned error can't find the container with id b37d9ed9659f032b4f7ea4bcdd938f8e70f94f9e83d64620c35f8442a36e46bb Apr 22 18:55:13.120984 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.120963 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:55:13.464514 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.464479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" event={"ID":"4b7d8fb8-69de-479d-8005-5d4215584a17","Type":"ContainerStarted","Data":"b37d9ed9659f032b4f7ea4bcdd938f8e70f94f9e83d64620c35f8442a36e46bb"} Apr 22 18:55:13.464514 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.464508 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:13.469258 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.469240 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:13.563692 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.563674 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crl9b\" (UniqueName: \"kubernetes.io/projected/af2c914c-2d3c-42a0-8ecd-264a3a05df71-kube-api-access-crl9b\") pod \"af2c914c-2d3c-42a0-8ecd-264a3a05df71\" (UID: \"af2c914c-2d3c-42a0-8ecd-264a3a05df71\") " Apr 22 18:55:13.565424 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.565393 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2c914c-2d3c-42a0-8ecd-264a3a05df71-kube-api-access-crl9b" (OuterVolumeSpecName: "kube-api-access-crl9b") pod "af2c914c-2d3c-42a0-8ecd-264a3a05df71" (UID: "af2c914c-2d3c-42a0-8ecd-264a3a05df71"). InnerVolumeSpecName "kube-api-access-crl9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:13.665193 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:13.665148 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crl9b\" (UniqueName: \"kubernetes.io/projected/af2c914c-2d3c-42a0-8ecd-264a3a05df71-kube-api-access-crl9b\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:55:14.468898 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.468865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" event={"ID":"4b7d8fb8-69de-479d-8005-5d4215584a17","Type":"ContainerStarted","Data":"d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f"} Apr 22 18:55:14.469083 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.468893 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" podUID="4b7d8fb8-69de-479d-8005-5d4215584a17" containerName="authorino" containerID="cri-o://d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f" gracePeriod=30 Apr 22 18:55:14.469083 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.468908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-6mfmx" Apr 22 18:55:14.484218 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.484176 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" podStartSLOduration=2.093167067 podStartE2EDuration="2.484164758s" podCreationTimestamp="2026-04-22 18:55:12 +0000 UTC" firstStartedPulling="2026-04-22 18:55:13.121081333 +0000 UTC m=+749.360713670" lastFinishedPulling="2026-04-22 18:55:13.51207901 +0000 UTC m=+749.751711361" observedRunningTime="2026-04-22 18:55:14.483097096 +0000 UTC m=+750.722729446" watchObservedRunningTime="2026-04-22 18:55:14.484164758 +0000 UTC m=+750.723797094" Apr 22 18:55:14.505596 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.505568 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-6mfmx"] Apr 22 18:55:14.508858 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.508831 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-6mfmx"] Apr 22 18:55:14.705739 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.705717 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:14.775148 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.775075 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlw6t\" (UniqueName: \"kubernetes.io/projected/4b7d8fb8-69de-479d-8005-5d4215584a17-kube-api-access-dlw6t\") pod \"4b7d8fb8-69de-479d-8005-5d4215584a17\" (UID: \"4b7d8fb8-69de-479d-8005-5d4215584a17\") " Apr 22 18:55:14.777104 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.777079 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7d8fb8-69de-479d-8005-5d4215584a17-kube-api-access-dlw6t" (OuterVolumeSpecName: "kube-api-access-dlw6t") pod "4b7d8fb8-69de-479d-8005-5d4215584a17" (UID: "4b7d8fb8-69de-479d-8005-5d4215584a17"). InnerVolumeSpecName "kube-api-access-dlw6t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:14.876081 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:14.876052 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dlw6t\" (UniqueName: \"kubernetes.io/projected/4b7d8fb8-69de-479d-8005-5d4215584a17-kube-api-access-dlw6t\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:55:15.122436 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.122359 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5d44dbddb-f6vlm"] Apr 22 18:55:15.122698 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.122684 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b7d8fb8-69de-479d-8005-5d4215584a17" containerName="authorino" Apr 22 18:55:15.122698 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.122699 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7d8fb8-69de-479d-8005-5d4215584a17" containerName="authorino" Apr 22 18:55:15.123049 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.122805 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b7d8fb8-69de-479d-8005-5d4215584a17" containerName="authorino" Apr 22 18:55:15.125829 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.125814 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:15.128225 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.128203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jh8rj\"" Apr 22 18:55:15.132311 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.132286 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-f6vlm"] Apr 22 18:55:15.284666 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.284631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mcs9\" (UniqueName: \"kubernetes.io/projected/68d063f1-e405-47e8-9358-5c56110901a2-kube-api-access-7mcs9\") pod \"maas-controller-5d44dbddb-f6vlm\" (UID: \"68d063f1-e405-47e8-9358-5c56110901a2\") " pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:15.385207 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.385117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mcs9\" (UniqueName: \"kubernetes.io/projected/68d063f1-e405-47e8-9358-5c56110901a2-kube-api-access-7mcs9\") pod \"maas-controller-5d44dbddb-f6vlm\" (UID: \"68d063f1-e405-47e8-9358-5c56110901a2\") " pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:15.394034 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.394010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mcs9\" (UniqueName: \"kubernetes.io/projected/68d063f1-e405-47e8-9358-5c56110901a2-kube-api-access-7mcs9\") pod \"maas-controller-5d44dbddb-f6vlm\" (UID: \"68d063f1-e405-47e8-9358-5c56110901a2\") " pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:15.437023 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.436962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:15.477421 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.477387 2575 generic.go:358] "Generic (PLEG): container finished" podID="4b7d8fb8-69de-479d-8005-5d4215584a17" containerID="d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f" exitCode=0 Apr 22 18:55:15.477582 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.477449 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" Apr 22 18:55:15.477582 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.477478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" event={"ID":"4b7d8fb8-69de-479d-8005-5d4215584a17","Type":"ContainerDied","Data":"d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f"} Apr 22 18:55:15.477582 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.477526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bf4f4944-6gqxj" event={"ID":"4b7d8fb8-69de-479d-8005-5d4215584a17","Type":"ContainerDied","Data":"b37d9ed9659f032b4f7ea4bcdd938f8e70f94f9e83d64620c35f8442a36e46bb"} Apr 22 18:55:15.477582 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.477546 2575 scope.go:117] "RemoveContainer" containerID="d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f" Apr 22 18:55:15.489133 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.489111 2575 scope.go:117] "RemoveContainer" containerID="d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f" Apr 22 18:55:15.489439 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:55:15.489417 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f\": container with ID starting with d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f not found: ID does not exist" containerID="d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f" Apr 22 18:55:15.489540 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.489452 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f"} err="failed to get container status \"d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f\": rpc error: code = NotFound desc = could not find container \"d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f\": container with ID starting with d893fa326bd139b75e5ae68e77a1de2789bb13a19a3b66526f6db6d11c89745f not found: ID does not exist" Apr 22 18:55:15.501036 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.501007 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68bf4f4944-6gqxj"] Apr 22 18:55:15.504262 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.504239 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-68bf4f4944-6gqxj"] Apr 22 18:55:15.557141 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:15.557114 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-f6vlm"] Apr 22 18:55:15.559751 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:55:15.559725 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d063f1_e405_47e8_9358_5c56110901a2.slice/crio-fa46f2009bf8befce0c9ebfdbcfa7c08118baeca0f44d5072317a5b425b45074 WatchSource:0}: Error finding container fa46f2009bf8befce0c9ebfdbcfa7c08118baeca0f44d5072317a5b425b45074: Status 404 returned error can't find the container with id fa46f2009bf8befce0c9ebfdbcfa7c08118baeca0f44d5072317a5b425b45074 Apr 22 18:55:16.196017 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:16.195961 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7d8fb8-69de-479d-8005-5d4215584a17" 
path="/var/lib/kubelet/pods/4b7d8fb8-69de-479d-8005-5d4215584a17/volumes" Apr 22 18:55:16.196500 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:16.196433 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2c914c-2d3c-42a0-8ecd-264a3a05df71" path="/var/lib/kubelet/pods/af2c914c-2d3c-42a0-8ecd-264a3a05df71/volumes" Apr 22 18:55:16.482193 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:16.482152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" event={"ID":"68d063f1-e405-47e8-9358-5c56110901a2","Type":"ContainerStarted","Data":"fa46f2009bf8befce0c9ebfdbcfa7c08118baeca0f44d5072317a5b425b45074"} Apr 22 18:55:18.492571 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:18.492535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" event={"ID":"68d063f1-e405-47e8-9358-5c56110901a2","Type":"ContainerStarted","Data":"83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7"} Apr 22 18:55:18.492995 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:18.492586 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:18.507216 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:18.507015 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" podStartSLOduration=1.094026097 podStartE2EDuration="3.506996573s" podCreationTimestamp="2026-04-22 18:55:15 +0000 UTC" firstStartedPulling="2026-04-22 18:55:15.561487873 +0000 UTC m=+751.801120210" lastFinishedPulling="2026-04-22 18:55:17.974458335 +0000 UTC m=+754.214090686" observedRunningTime="2026-04-22 18:55:18.506431567 +0000 UTC m=+754.746063935" watchObservedRunningTime="2026-04-22 18:55:18.506996573 +0000 UTC m=+754.746628932" Apr 22 18:55:29.502047 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:29.502005 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:43.152388 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.152352 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-f6vlm"] Apr 22 18:55:43.152862 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.152672 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" podUID="68d063f1-e405-47e8-9358-5c56110901a2" containerName="manager" containerID="cri-o://83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7" gracePeriod=10 Apr 22 18:55:43.387596 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.387576 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:43.413835 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.413765 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mcs9\" (UniqueName: \"kubernetes.io/projected/68d063f1-e405-47e8-9358-5c56110901a2-kube-api-access-7mcs9\") pod \"68d063f1-e405-47e8-9358-5c56110901a2\" (UID: \"68d063f1-e405-47e8-9358-5c56110901a2\") " Apr 22 18:55:43.416166 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.416128 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d063f1-e405-47e8-9358-5c56110901a2-kube-api-access-7mcs9" (OuterVolumeSpecName: "kube-api-access-7mcs9") pod "68d063f1-e405-47e8-9358-5c56110901a2" (UID: "68d063f1-e405-47e8-9358-5c56110901a2"). InnerVolumeSpecName "kube-api-access-7mcs9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:43.514525 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.514487 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mcs9\" (UniqueName: \"kubernetes.io/projected/68d063f1-e405-47e8-9358-5c56110901a2-kube-api-access-7mcs9\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 18:55:43.577820 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.577783 2575 generic.go:358] "Generic (PLEG): container finished" podID="68d063f1-e405-47e8-9358-5c56110901a2" containerID="83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7" exitCode=0 Apr 22 18:55:43.577968 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.577854 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" Apr 22 18:55:43.577968 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.577874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" event={"ID":"68d063f1-e405-47e8-9358-5c56110901a2","Type":"ContainerDied","Data":"83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7"} Apr 22 18:55:43.577968 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.577912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-f6vlm" event={"ID":"68d063f1-e405-47e8-9358-5c56110901a2","Type":"ContainerDied","Data":"fa46f2009bf8befce0c9ebfdbcfa7c08118baeca0f44d5072317a5b425b45074"} Apr 22 18:55:43.577968 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.577927 2575 scope.go:117] "RemoveContainer" containerID="83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7" Apr 22 18:55:43.586170 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.586151 2575 scope.go:117] "RemoveContainer" containerID="83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7" Apr 22 18:55:43.586417 ip-10-0-130-32 kubenswrapper[2575]: E0422 18:55:43.586398 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7\": container with ID starting with 83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7 not found: ID does not exist" containerID="83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7" Apr 22 18:55:43.586476 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.586429 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7"} 
err="failed to get container status \"83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7\": rpc error: code = NotFound desc = could not find container \"83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7\": container with ID starting with 83e1dbef1ab09c4d0a1788681fc1e186c6d3e871f7936d95fb33775833af8ea7 not found: ID does not exist" Apr 22 18:55:43.599551 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.599528 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-f6vlm"] Apr 22 18:55:43.606826 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:43.606803 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-f6vlm"] Apr 22 18:55:44.194513 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:55:44.194488 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d063f1-e405-47e8-9358-5c56110901a2" path="/var/lib/kubelet/pods/68d063f1-e405-47e8-9358-5c56110901a2/volumes" Apr 22 18:56:00.426011 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.425929 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp"] Apr 22 18:56:00.426414 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.426275 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68d063f1-e405-47e8-9358-5c56110901a2" containerName="manager" Apr 22 18:56:00.426414 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.426287 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d063f1-e405-47e8-9358-5c56110901a2" containerName="manager" Apr 22 18:56:00.426414 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.426363 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="68d063f1-e405-47e8-9358-5c56110901a2" containerName="manager" Apr 22 18:56:00.429648 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.429628 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.432934 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.432910 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 18:56:00.433042 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.432910 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 18:56:00.433042 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.432920 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 22 18:56:00.433042 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.432979 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-r6h6w\"" Apr 22 18:56:00.439250 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.439229 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp"] Apr 22 18:56:00.565401 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.565358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.565577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.565422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.565577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.565457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5p8\" (UniqueName: \"kubernetes.io/projected/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-kube-api-access-zc5p8\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.565577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.565477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.565577 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.565514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.565577 
ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.565547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666289 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666490 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5p8\" (UniqueName: \"kubernetes.io/projected/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-kube-api-access-zc5p8\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666490 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666490 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666490 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666490 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666754 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666837 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.666876 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.666819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.668816 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.668797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.668972 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.668956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.675094 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.675070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5p8\" (UniqueName: \"kubernetes.io/projected/782ef3df-ab21-461b-b8e3-cf4f2a87e46f-kube-api-access-zc5p8\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-g78qp\" (UID: \"782ef3df-ab21-461b-b8e3-cf4f2a87e46f\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.739861 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.739832 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:00.864671 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:00.864650 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp"] Apr 22 18:56:00.867053 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:56:00.867020 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782ef3df_ab21_461b_b8e3_cf4f2a87e46f.slice/crio-874a06ac6171f6b6d439be53800a41580e7551a46732373cdd7147827a0d9c56 WatchSource:0}: Error finding container 874a06ac6171f6b6d439be53800a41580e7551a46732373cdd7147827a0d9c56: Status 404 returned error can't find the container with id 874a06ac6171f6b6d439be53800a41580e7551a46732373cdd7147827a0d9c56 Apr 22 18:56:01.637008 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:01.636973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" event={"ID":"782ef3df-ab21-461b-b8e3-cf4f2a87e46f","Type":"ContainerStarted","Data":"874a06ac6171f6b6d439be53800a41580e7551a46732373cdd7147827a0d9c56"} Apr 22 18:56:07.665344 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:07.665307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" event={"ID":"782ef3df-ab21-461b-b8e3-cf4f2a87e46f","Type":"ContainerStarted","Data":"92dfef94eecfc8f26db9855eb8b03bc65516aeaaf6fff8b77f9bf5e8fe1f388a"} Apr 22 18:56:16.696241 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:16.696206 2575 generic.go:358] "Generic (PLEG): container finished" podID="782ef3df-ab21-461b-b8e3-cf4f2a87e46f" containerID="92dfef94eecfc8f26db9855eb8b03bc65516aeaaf6fff8b77f9bf5e8fe1f388a" exitCode=0 Apr 22 18:56:16.696666 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:16.696251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" event={"ID":"782ef3df-ab21-461b-b8e3-cf4f2a87e46f","Type":"ContainerDied","Data":"92dfef94eecfc8f26db9855eb8b03bc65516aeaaf6fff8b77f9bf5e8fe1f388a"} Apr 22 18:56:18.710247 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:18.710198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" event={"ID":"782ef3df-ab21-461b-b8e3-cf4f2a87e46f","Type":"ContainerStarted","Data":"4287aa6e212f23b21e8da689eb0460fb6abf78514821610041aa389088b403b4"} Apr 22 18:56:18.710655 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:18.710534 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:18.729426 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:18.729362 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" podStartSLOduration=1.807255759 podStartE2EDuration="18.729346209s" podCreationTimestamp="2026-04-22 18:56:00 +0000 UTC" firstStartedPulling="2026-04-22 18:56:00.868940522 +0000 UTC m=+797.108572859" lastFinishedPulling="2026-04-22 18:56:17.791030969 +0000 UTC m=+814.030663309" observedRunningTime="2026-04-22 18:56:18.72805968 +0000 UTC m=+814.967692051" watchObservedRunningTime="2026-04-22 18:56:18.729346209 +0000 UTC m=+814.968978570" Apr 22 18:56:29.727172 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:29.727142 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-g78qp" Apr 22 18:56:33.832142 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:33.832109 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg"] Apr 22 18:56:33.965167 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:33.965131 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg"] Apr 22 18:56:33.965334 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:33.965253 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:33.967706 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:33.967685 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 22 18:56:34.035726 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.035697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.035726 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.035728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.035970 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.035748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.035970 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.035809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.035970 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.035849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7067b83c-5305-41a3-8524-c1b5b42a8026-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.035970 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.035869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xkk\" (UniqueName: \"kubernetes.io/projected/7067b83c-5305-41a3-8524-c1b5b42a8026-kube-api-access-r9xkk\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136305 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7067b83c-5305-41a3-8524-c1b5b42a8026-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136305 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xkk\" (UniqueName: \"kubernetes.io/projected/7067b83c-5305-41a3-8524-c1b5b42a8026-kube-api-access-r9xkk\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136305 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136517 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136617 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136673 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136852 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136831 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.136930 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.136893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.138627 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.138607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7067b83c-5305-41a3-8524-c1b5b42a8026-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.138745 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.138724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7067b83c-5305-41a3-8524-c1b5b42a8026-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.143324 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.143301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xkk\" (UniqueName: \"kubernetes.io/projected/7067b83c-5305-41a3-8524-c1b5b42a8026-kube-api-access-r9xkk\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xmvlg\" (UID: \"7067b83c-5305-41a3-8524-c1b5b42a8026\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.275305 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.275277 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:34.394485 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.394417 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg"] Apr 22 18:56:34.397867 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:56:34.397837 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7067b83c_5305_41a3_8524_c1b5b42a8026.slice/crio-45c63c87cde565747c5bbd28a8584053317a8a79b3684f3fe8afad1613f730d7 WatchSource:0}: Error finding container 45c63c87cde565747c5bbd28a8584053317a8a79b3684f3fe8afad1613f730d7: Status 404 returned error can't find the container with id 45c63c87cde565747c5bbd28a8584053317a8a79b3684f3fe8afad1613f730d7 Apr 22 18:56:34.765345 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.765309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" event={"ID":"7067b83c-5305-41a3-8524-c1b5b42a8026","Type":"ContainerStarted","Data":"506a32ba637ce21d56e2bf2440f518a6e87de72f05818422c62ab1cc02341ccb"} Apr 22 18:56:34.765345 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:34.765346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" event={"ID":"7067b83c-5305-41a3-8524-c1b5b42a8026","Type":"ContainerStarted","Data":"45c63c87cde565747c5bbd28a8584053317a8a79b3684f3fe8afad1613f730d7"} Apr 22 18:56:36.318240 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.318195 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc"] Apr 22 18:56:36.320377 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.320353 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.323029 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.323004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 22 18:56:36.332205 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.332183 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc"] Apr 22 18:56:36.358015 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.357971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.358173 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.358026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.358173 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.358094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4pxl\" (UniqueName: \"kubernetes.io/projected/9b8ba212-e859-44de-b152-580da1af3aef-kube-api-access-v4pxl\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.358173 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.358156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.358337 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.358190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b8ba212-e859-44de-b152-580da1af3aef-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.358337 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.358270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.458988 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.458952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: 
\"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.458988 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.458989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b8ba212-e859-44de-b152-580da1af3aef-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459226 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459226 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459226 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459226 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4pxl\" (UniqueName: \"kubernetes.io/projected/9b8ba212-e859-44de-b152-580da1af3aef-kube-api-access-v4pxl\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459543 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459634 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.459634 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.459583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.461509 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.461490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9b8ba212-e859-44de-b152-580da1af3aef-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.461638 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.461624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9b8ba212-e859-44de-b152-580da1af3aef-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.468727 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.468703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4pxl\" (UniqueName: \"kubernetes.io/projected/9b8ba212-e859-44de-b152-580da1af3aef-kube-api-access-v4pxl\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc\" (UID: \"9b8ba212-e859-44de-b152-580da1af3aef\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.632506 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.632423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:36.760708 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.760664 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc"] Apr 22 18:56:36.762796 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:56:36.762746 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8ba212_e859_44de_b152_580da1af3aef.slice/crio-d06161be972cd86bf456fbb2cb3eff523847678baf4f63507368a189422c2c7f WatchSource:0}: Error finding container d06161be972cd86bf456fbb2cb3eff523847678baf4f63507368a189422c2c7f: Status 404 returned error can't find the container with id d06161be972cd86bf456fbb2cb3eff523847678baf4f63507368a189422c2c7f Apr 22 18:56:36.773003 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:36.772975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" event={"ID":"9b8ba212-e859-44de-b152-580da1af3aef","Type":"ContainerStarted","Data":"d06161be972cd86bf456fbb2cb3eff523847678baf4f63507368a189422c2c7f"} Apr 22 18:56:37.778325 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:37.778288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" event={"ID":"9b8ba212-e859-44de-b152-580da1af3aef","Type":"ContainerStarted","Data":"f7c81b0e31759d488316f8ccd08b053dc78388149d4514910b1a7d7f64dac39a"} Apr 22 18:56:39.786865 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:39.786825 2575 generic.go:358] "Generic (PLEG): container finished" podID="7067b83c-5305-41a3-8524-c1b5b42a8026" containerID="506a32ba637ce21d56e2bf2440f518a6e87de72f05818422c62ab1cc02341ccb" exitCode=0 Apr 22 18:56:39.787276 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:39.786862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" 
event={"ID":"7067b83c-5305-41a3-8524-c1b5b42a8026","Type":"ContainerDied","Data":"506a32ba637ce21d56e2bf2440f518a6e87de72f05818422c62ab1cc02341ccb"} Apr 22 18:56:40.508197 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.508157 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b"] Apr 22 18:56:40.510640 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.510616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.513361 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.513339 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 22 18:56:40.519782 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.519734 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b"] Apr 22 18:56:40.597461 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.597431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.597635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.597483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.597635 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.597591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.597722 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.597660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.597722 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.597695 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.597809 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.597726 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgh9t\" (UniqueName: \"kubernetes.io/projected/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-kube-api-access-vgh9t\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.698659 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.698621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.698881 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.698691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.698881 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.698715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.698881 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.698739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgh9t\" (UniqueName: \"kubernetes.io/projected/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-kube-api-access-vgh9t\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.698881 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.698797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.698881 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.698866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.699133 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.699015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-kserve-provision-location\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.699133 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.699084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.699133 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.699123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.701102 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.701083 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.701245 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.701227 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.706069 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.706049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgh9t\" (UniqueName: \"kubernetes.io/projected/e62a65d3-9e6d-4a16-acc6-f1551bbcc181-kube-api-access-vgh9t\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b\" (UID: \"e62a65d3-9e6d-4a16-acc6-f1551bbcc181\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.792032 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.791935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" event={"ID":"7067b83c-5305-41a3-8524-c1b5b42a8026","Type":"ContainerStarted","Data":"9ebfb365e24a9a85563301be6e281eac5cbbbf1489387332724b8baf378ba911"} Apr 22 18:56:40.792433 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.792188 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:40.809015 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.808834 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" podStartSLOduration=7.48964833 podStartE2EDuration="7.808818512s" podCreationTimestamp="2026-04-22 18:56:33 +0000 UTC" firstStartedPulling="2026-04-22 18:56:39.787598633 +0000 UTC m=+836.027230970" lastFinishedPulling="2026-04-22 18:56:40.106768811 +0000 UTC m=+836.346401152" observedRunningTime="2026-04-22 
18:56:40.80836533 +0000 UTC m=+837.047997717" watchObservedRunningTime="2026-04-22 18:56:40.808818512 +0000 UTC m=+837.048450872" Apr 22 18:56:40.823394 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.823370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:40.951738 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:40.951686 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b"] Apr 22 18:56:40.953804 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:56:40.953753 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62a65d3_9e6d_4a16_acc6_f1551bbcc181.slice/crio-cdac29fce20faeaced6f254e5b440b3f2feb6c3dfd37418fc697110e42516f74 WatchSource:0}: Error finding container cdac29fce20faeaced6f254e5b440b3f2feb6c3dfd37418fc697110e42516f74: Status 404 returned error can't find the container with id cdac29fce20faeaced6f254e5b440b3f2feb6c3dfd37418fc697110e42516f74 Apr 22 18:56:41.797070 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:41.797026 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" event={"ID":"e62a65d3-9e6d-4a16-acc6-f1551bbcc181","Type":"ContainerStarted","Data":"9b3766f3cdb54ea2641f6641b047eec689cae46608cea03b8a78a5d274dc515c"} Apr 22 18:56:41.797473 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:41.797079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" event={"ID":"e62a65d3-9e6d-4a16-acc6-f1551bbcc181","Type":"ContainerStarted","Data":"cdac29fce20faeaced6f254e5b440b3f2feb6c3dfd37418fc697110e42516f74"} Apr 22 18:56:42.801633 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:42.801594 2575 generic.go:358] "Generic (PLEG): container finished" podID="9b8ba212-e859-44de-b152-580da1af3aef" containerID="f7c81b0e31759d488316f8ccd08b053dc78388149d4514910b1a7d7f64dac39a" exitCode=0 Apr 22 18:56:42.802083 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:42.801673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" event={"ID":"9b8ba212-e859-44de-b152-580da1af3aef","Type":"ContainerDied","Data":"f7c81b0e31759d488316f8ccd08b053dc78388149d4514910b1a7d7f64dac39a"} Apr 22 18:56:43.806889 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:43.806853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" event={"ID":"9b8ba212-e859-44de-b152-580da1af3aef","Type":"ContainerStarted","Data":"8571a994d3faa223221bc345b9b199bcaf0b03e0a09d7caef3a2fdaf6595922c"} Apr 22 18:56:43.807239 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:43.807079 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" Apr 22 18:56:43.826866 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:43.826808 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" podStartSLOduration=7.556964171 podStartE2EDuration="7.826793205s" podCreationTimestamp="2026-04-22 18:56:36 +0000 UTC" firstStartedPulling="2026-04-22 18:56:42.802365977 +0000 UTC m=+839.041998314" lastFinishedPulling="2026-04-22 18:56:43.072194997 +0000 UTC m=+839.311827348" observedRunningTime="2026-04-22 
18:56:43.825614382 +0000 UTC m=+840.065246748" watchObservedRunningTime="2026-04-22 18:56:43.826793205 +0000 UTC m=+840.066425568" Apr 22 18:56:46.817481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:46.817395 2575 generic.go:358] "Generic (PLEG): container finished" podID="e62a65d3-9e6d-4a16-acc6-f1551bbcc181" containerID="9b3766f3cdb54ea2641f6641b047eec689cae46608cea03b8a78a5d274dc515c" exitCode=0 Apr 22 18:56:46.817481 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:46.817467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" event={"ID":"e62a65d3-9e6d-4a16-acc6-f1551bbcc181","Type":"ContainerDied","Data":"9b3766f3cdb54ea2641f6641b047eec689cae46608cea03b8a78a5d274dc515c"} Apr 22 18:56:47.822553 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:47.822517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" event={"ID":"e62a65d3-9e6d-4a16-acc6-f1551bbcc181","Type":"ContainerStarted","Data":"813d45eee11be3f99c98a4498e22fe33c433021a847fa972c1988335272da19b"} Apr 22 18:56:47.823084 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:47.822750 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:56:47.842225 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:47.842177 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" podStartSLOduration=7.634743739 podStartE2EDuration="7.842164588s" podCreationTimestamp="2026-04-22 18:56:40 +0000 UTC" firstStartedPulling="2026-04-22 18:56:46.818081764 +0000 UTC m=+843.057714102" lastFinishedPulling="2026-04-22 18:56:47.025502614 +0000 UTC m=+843.265134951" observedRunningTime="2026-04-22 18:56:47.839347448 +0000 UTC m=+844.078979808" watchObservedRunningTime="2026-04-22 18:56:47.842164588 +0000 UTC m=+844.081796947" Apr 22 18:56:48.623127 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.623096 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr"] Apr 22 18:56:48.625369 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.625349 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.627668 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.627652 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 22 18:56:48.636066 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.636031 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr"] Apr 22 18:56:48.670754 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.670725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.670754 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.670756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.670925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.670819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.670925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.670885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8826k\" (UniqueName: \"kubernetes.io/projected/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-kube-api-access-8826k\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.670925 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.670909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.671019 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.670939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.771544 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.771510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.771686 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.771588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.771686 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.771612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.771686 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.771641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.771686 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.771669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8826k\" (UniqueName: \"kubernetes.io/projected/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-kube-api-access-8826k\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.771920 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.771702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.772105 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.772079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.772194 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.772115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.772194 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.772177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-model-cache\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.773880 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.773861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.774266 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.774248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.782040 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.782019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8826k\" (UniqueName: \"kubernetes.io/projected/8dd278cf-7fd4-4ce4-b68b-93fb82a0036e-kube-api-access-8826k\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr\" (UID: \"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:48.935655 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:48.935573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:49.261229 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:49.261080 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr"] Apr 22 18:56:49.264699 ip-10-0-130-32 kubenswrapper[2575]: W0422 18:56:49.264666 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd278cf_7fd4_4ce4_b68b_93fb82a0036e.slice/crio-6abe0e6c293495ef07035365cf61d87d5c1993a0dd7a1736eb2ae5687a2be0c4 WatchSource:0}: Error finding container 6abe0e6c293495ef07035365cf61d87d5c1993a0dd7a1736eb2ae5687a2be0c4: Status 404 returned error can't find the container with id 6abe0e6c293495ef07035365cf61d87d5c1993a0dd7a1736eb2ae5687a2be0c4 Apr 22 18:56:49.831354 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:49.831317 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" event={"ID":"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e","Type":"ContainerStarted","Data":"775c5fed6c51f6ede1ca60cce390513114315371e59d3944f52b7c68735b9c9b"} Apr 22 18:56:49.831555 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:49.831360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" event={"ID":"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e","Type":"ContainerStarted","Data":"6abe0e6c293495ef07035365cf61d87d5c1993a0dd7a1736eb2ae5687a2be0c4"} Apr 22 18:56:51.810489 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:51.810459 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xmvlg" Apr 22 18:56:54.823252 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:54.823174 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc" 
Apr 22 18:56:54.852754 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:54.852726 2575 generic.go:358] "Generic (PLEG): container finished" podID="8dd278cf-7fd4-4ce4-b68b-93fb82a0036e" containerID="775c5fed6c51f6ede1ca60cce390513114315371e59d3944f52b7c68735b9c9b" exitCode=0 Apr 22 18:56:54.852920 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:54.852810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" event={"ID":"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e","Type":"ContainerDied","Data":"775c5fed6c51f6ede1ca60cce390513114315371e59d3944f52b7c68735b9c9b"} Apr 22 18:56:55.859121 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:55.859087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" event={"ID":"8dd278cf-7fd4-4ce4-b68b-93fb82a0036e","Type":"ContainerStarted","Data":"7a4c29457f42c403df846be1bdf3f78b8254e4262edd75cf3c9f82bcb4abfc12"} Apr 22 18:56:55.859543 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:55.859320 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:56:55.878996 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:55.878950 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" podStartSLOduration=7.696833154 podStartE2EDuration="7.878935558s" podCreationTimestamp="2026-04-22 18:56:48 +0000 UTC" firstStartedPulling="2026-04-22 18:56:54.853425812 +0000 UTC m=+851.093058150" lastFinishedPulling="2026-04-22 18:56:55.035528205 +0000 UTC m=+851.275160554" observedRunningTime="2026-04-22 18:56:55.876632163 +0000 UTC m=+852.116264522" watchObservedRunningTime="2026-04-22 18:56:55.878935558 +0000 UTC m=+852.118567917" Apr 22 18:56:58.839526 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:56:58.839492 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b" Apr 22 18:57:06.874698 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:57:06.874666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr" Apr 22 18:57:44.146586 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:57:44.146503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:57:44.149452 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:57:44.149429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 18:57:44.153017 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:57:44.152996 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:57:44.156471 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:57:44.156452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 18:59:44.800242 ip-10-0-130-32 kubenswrapper[2575]: I0422 18:59:44.800199 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-32.ec2.internal" 
podUID="150594baeab35cb17ccfa66548a34222" containerName="haproxy" probeResult="failure" output="Get \"https://172.20.0.1:6443/version\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 22 19:02:44.170564 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:02:44.170534 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 19:02:44.176122 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:02:44.176103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 19:02:44.176249 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:02:44.176144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 19:02:44.181699 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:02:44.181683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 19:07:44.195915 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:07:44.195885 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 19:07:44.200909 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:07:44.200886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 19:07:44.201312 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:07:44.201297 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 19:07:44.205895 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:07:44.205879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 19:09:22.548002 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.547955 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l"] Apr 22 19:09:22.550748 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.548234 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" podUID="16a312c7-51c8-4f45-94f1-7dea30519951" containerName="manager" containerID="cri-o://8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb" gracePeriod=10 Apr 22 19:09:22.786041 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.786018 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 19:09:22.965917 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.965892 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbgxn\" (UniqueName: \"kubernetes.io/projected/16a312c7-51c8-4f45-94f1-7dea30519951-kube-api-access-rbgxn\") pod \"16a312c7-51c8-4f45-94f1-7dea30519951\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " Apr 22 19:09:22.966067 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.965982 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16a312c7-51c8-4f45-94f1-7dea30519951-extensions-socket-volume\") pod \"16a312c7-51c8-4f45-94f1-7dea30519951\" (UID: \"16a312c7-51c8-4f45-94f1-7dea30519951\") " Apr 22 19:09:22.966356 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.966328 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a312c7-51c8-4f45-94f1-7dea30519951-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "16a312c7-51c8-4f45-94f1-7dea30519951" (UID: "16a312c7-51c8-4f45-94f1-7dea30519951"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:22.967837 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:22.967814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a312c7-51c8-4f45-94f1-7dea30519951-kube-api-access-rbgxn" (OuterVolumeSpecName: "kube-api-access-rbgxn") pod "16a312c7-51c8-4f45-94f1-7dea30519951" (UID: "16a312c7-51c8-4f45-94f1-7dea30519951"). InnerVolumeSpecName "kube-api-access-rbgxn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:23.067586 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.067515 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16a312c7-51c8-4f45-94f1-7dea30519951-extensions-socket-volume\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 19:09:23.067586 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.067541 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbgxn\" (UniqueName: \"kubernetes.io/projected/16a312c7-51c8-4f45-94f1-7dea30519951-kube-api-access-rbgxn\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\"" Apr 22 19:09:23.424805 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.424696 2575 generic.go:358] "Generic (PLEG): container finished" podID="16a312c7-51c8-4f45-94f1-7dea30519951" containerID="8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb" exitCode=0 Apr 22 19:09:23.424805 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.424759 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" Apr 22 19:09:23.424805 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.424765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" event={"ID":"16a312c7-51c8-4f45-94f1-7dea30519951","Type":"ContainerDied","Data":"8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb"} Apr 22 19:09:23.425056 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.424814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l" event={"ID":"16a312c7-51c8-4f45-94f1-7dea30519951","Type":"ContainerDied","Data":"456f4ffc72d899efc9bdc850d4eff203ab1806c8754fcf09d6c8927239c13763"} Apr 22 19:09:23.425056 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.424829 2575 scope.go:117] "RemoveContainer" containerID="8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb" Apr 22 19:09:23.433523 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.433502 2575 scope.go:117] "RemoveContainer" containerID="8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb" Apr 22 19:09:23.433758 ip-10-0-130-32 kubenswrapper[2575]: E0422 19:09:23.433740 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb\": container with ID starting with 8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb not found: ID does not exist" containerID="8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb" Apr 22 19:09:23.433843 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.433768 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb"} err="failed to get container status \"8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb\": rpc error: code = NotFound desc = could not find container \"8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb\": container with ID starting with 8ef494f64465286d295b38d98587640d41c39afe487ee9b28a9af4cdce75beeb not found: ID does not exist" Apr 22 19:09:23.448219 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.448165 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l"] Apr 22 19:09:23.450218 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:23.450195 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7wl6l"] Apr 22 19:09:24.194180 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:09:24.194147 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a312c7-51c8-4f45-94f1-7dea30519951" path="/var/lib/kubelet/pods/16a312c7-51c8-4f45-94f1-7dea30519951/volumes" Apr 22 19:12:44.219266 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:12:44.219214 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 19:12:44.224524 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:12:44.224504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log" Apr 22 
19:12:44.224642 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:12:44.224627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 19:12:44.229740 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:12:44.229724 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log" Apr 22 19:15:00.147200 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.147167 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29614755-2sh9d"] Apr 22 19:15:00.149858 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.147499 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16a312c7-51c8-4f45-94f1-7dea30519951" containerName="manager" Apr 22 19:15:00.149858 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.147509 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a312c7-51c8-4f45-94f1-7dea30519951" containerName="manager" Apr 22 19:15:00.149858 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.147589 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="16a312c7-51c8-4f45-94f1-7dea30519951" containerName="manager" Apr 22 19:15:00.150886 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.150864 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" Apr 22 19:15:00.154931 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.154912 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-4fsbs\"" Apr 22 19:15:00.230517 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.230483 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614755-2sh9d"] Apr 22 19:15:00.282220 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.282187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsrnh\" (UniqueName: \"kubernetes.io/projected/2c7abbdc-132a-4446-bef1-63f1a368b70f-kube-api-access-gsrnh\") pod \"maas-api-key-cleanup-29614755-2sh9d\" (UID: \"2c7abbdc-132a-4446-bef1-63f1a368b70f\") " pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" Apr 22 19:15:00.383386 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.383351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsrnh\" (UniqueName: \"kubernetes.io/projected/2c7abbdc-132a-4446-bef1-63f1a368b70f-kube-api-access-gsrnh\") pod \"maas-api-key-cleanup-29614755-2sh9d\" (UID: \"2c7abbdc-132a-4446-bef1-63f1a368b70f\") " pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" Apr 22 19:15:00.391910 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.391882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsrnh\" (UniqueName: \"kubernetes.io/projected/2c7abbdc-132a-4446-bef1-63f1a368b70f-kube-api-access-gsrnh\") pod \"maas-api-key-cleanup-29614755-2sh9d\" (UID: \"2c7abbdc-132a-4446-bef1-63f1a368b70f\") " pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" Apr 22 19:15:00.461284 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.461258 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" Apr 22 19:15:00.792861 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.792839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614755-2sh9d"] Apr 22 19:15:00.794861 ip-10-0-130-32 kubenswrapper[2575]: W0422 19:15:00.794825 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c7abbdc_132a_4446_bef1_63f1a368b70f.slice/crio-cbe67ca4af5136c88dd0f80882eae7e1dd95147ba42320f6aa766660f60f4349 WatchSource:0}: Error finding container cbe67ca4af5136c88dd0f80882eae7e1dd95147ba42320f6aa766660f60f4349: Status 404 returned error can't find the container with id cbe67ca4af5136c88dd0f80882eae7e1dd95147ba42320f6aa766660f60f4349 Apr 22 19:15:00.796648 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:00.796630 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:15:01.570078 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:01.570041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerStarted","Data":"cbe67ca4af5136c88dd0f80882eae7e1dd95147ba42320f6aa766660f60f4349"} Apr 22 19:15:02.574549 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:02.574503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerStarted","Data":"bd9bb9fb85a08dcf960bdf4544322d0f8fe3958b117440cac32fe8e1d47e96ad"} Apr 22 19:15:02.593072 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:02.593029 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" podStartSLOduration=1.761990068 podStartE2EDuration="2.593015247s" podCreationTimestamp="2026-04-22 19:15:00 +0000 UTC" firstStartedPulling="2026-04-22 19:15:00.796837463 +0000 UTC m=+1937.036469814" lastFinishedPulling="2026-04-22 19:15:01.627862653 +0000 UTC m=+1937.867494993" observedRunningTime="2026-04-22 19:15:02.592567617 +0000 UTC m=+1938.832199977" watchObservedRunningTime="2026-04-22 19:15:02.593015247 +0000 UTC m=+1938.832647605" Apr 22 19:15:22.639587 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:22.639499 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerID="bd9bb9fb85a08dcf960bdf4544322d0f8fe3958b117440cac32fe8e1d47e96ad" exitCode=6 Apr 22 19:15:22.639587 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:22.639576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerDied","Data":"bd9bb9fb85a08dcf960bdf4544322d0f8fe3958b117440cac32fe8e1d47e96ad"} Apr 22 19:15:22.640003 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:22.639891 2575 scope.go:117] "RemoveContainer" containerID="bd9bb9fb85a08dcf960bdf4544322d0f8fe3958b117440cac32fe8e1d47e96ad" Apr 22 19:15:23.644928 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:23.644847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerStarted","Data":"0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"} Apr 22 19:15:43.715923 ip-10-0-130-32 
kubenswrapper[2575]: I0422 19:15:43.715884 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerID="0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442" exitCode=6
Apr 22 19:15:43.716347 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:43.715962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerDied","Data":"0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"}
Apr 22 19:15:43.716347 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:43.716008 2575 scope.go:117] "RemoveContainer" containerID="bd9bb9fb85a08dcf960bdf4544322d0f8fe3958b117440cac32fe8e1d47e96ad"
Apr 22 19:15:43.716347 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:43.716270 2575 scope.go:117] "RemoveContainer" containerID="0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"
Apr 22 19:15:43.716500 ip-10-0-130-32 kubenswrapper[2575]: E0422 19:15:43.716481 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29614755-2sh9d_opendatahub(2c7abbdc-132a-4446-bef1-63f1a368b70f)\"" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f"
Apr 22 19:15:58.190281 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:58.190246 2575 scope.go:117] "RemoveContainer" containerID="0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"
Apr 22 19:15:58.767540 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:58.767507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerStarted","Data":"987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822"}
Apr 22 19:15:59.217558 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:59.217523 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614755-2sh9d"]
Apr 22 19:15:59.772106 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:15:59.772050 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup" containerID="cri-o://987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822" gracePeriod=30
Apr 22 19:16:19.112419 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.112395 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d"
Apr 22 19:16:19.186037 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.185960 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsrnh\" (UniqueName: \"kubernetes.io/projected/2c7abbdc-132a-4446-bef1-63f1a368b70f-kube-api-access-gsrnh\") pod \"2c7abbdc-132a-4446-bef1-63f1a368b70f\" (UID: \"2c7abbdc-132a-4446-bef1-63f1a368b70f\") "
Apr 22 19:16:19.187901 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.187872 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7abbdc-132a-4446-bef1-63f1a368b70f-kube-api-access-gsrnh" (OuterVolumeSpecName: "kube-api-access-gsrnh") pod "2c7abbdc-132a-4446-bef1-63f1a368b70f" (UID: "2c7abbdc-132a-4446-bef1-63f1a368b70f"). InnerVolumeSpecName "kube-api-access-gsrnh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:16:19.287428 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.287401 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gsrnh\" (UniqueName: \"kubernetes.io/projected/2c7abbdc-132a-4446-bef1-63f1a368b70f-kube-api-access-gsrnh\") on node \"ip-10-0-130-32.ec2.internal\" DevicePath \"\""
Apr 22 19:16:19.841298 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.841255 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerID="987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822" exitCode=6
Apr 22 19:16:19.841482 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.841326 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d"
Apr 22 19:16:19.841482 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.841336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerDied","Data":"987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822"}
Apr 22 19:16:19.841482 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.841379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614755-2sh9d" event={"ID":"2c7abbdc-132a-4446-bef1-63f1a368b70f","Type":"ContainerDied","Data":"cbe67ca4af5136c88dd0f80882eae7e1dd95147ba42320f6aa766660f60f4349"}
Apr 22 19:16:19.841482 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.841395 2575 scope.go:117] "RemoveContainer" containerID="987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822"
Apr 22 19:16:19.851708 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.851684 2575 scope.go:117] "RemoveContainer" containerID="0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"
Apr 22 19:16:19.859110 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.859086 2575 scope.go:117] "RemoveContainer" containerID="987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822"
Apr 22 19:16:19.859382 ip-10-0-130-32 kubenswrapper[2575]: E0422 19:16:19.859364 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822\": container with ID starting with 987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822 not found: ID does not exist" containerID="987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822"
Apr 22 19:16:19.859443 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.859391 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822"} err="failed to get container status \"987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822\": rpc error: code = NotFound desc = could not find container \"987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822\": container with ID starting with 987fec2d6b847310b17b8b0d63fc08156d820b269f84b03d2818136a6f877822 not found: ID does not exist"
Apr 22 19:16:19.859443 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.859411 2575 scope.go:117] "RemoveContainer" containerID="0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"
Apr 22 19:16:19.859623 ip-10-0-130-32 kubenswrapper[2575]: E0422 19:16:19.859607 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442\": container with ID starting with 0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442 not found: ID does not exist" containerID="0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"
Apr 22 19:16:19.859665 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.859628 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442"} err="failed to get container status \"0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442\": rpc error: code = NotFound desc = could not find container \"0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442\": container with ID starting with 0da1bf5f3ac761220392c15ddeeb19e7793c57f6f4a2e4a7b1c040448ec35442 not found: ID does not exist"
Apr 22 19:16:19.865418 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.865393 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614755-2sh9d"]
Apr 22 19:16:19.869292 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:19.869271 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614755-2sh9d"]
Apr 22 19:16:20.194014 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:16:20.193937 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" path="/var/lib/kubelet/pods/2c7abbdc-132a-4446-bef1-63f1a368b70f/volumes"
Apr 22 19:17:44.245575 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:17:44.245542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log"
Apr 22 19:17:44.250671 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:17:44.250648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log"
Apr 22 19:17:44.251410 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:17:44.251386 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log"
Apr 22 19:17:44.256115 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:17:44.256100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hwf7s_ea3f4bad-3513-4bfe-9cd3-e706b42dc86c/ovn-acl-logging/0.log"
Apr 22 19:20:15.694401 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:15.694363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-dd89cc56c-ddt59_32b7588b-a569-4c47-95a7-5ab772e8e085/manager/0.log"
Apr 22 19:20:18.532019 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:18.531990 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7cff94f675-fwl9g_51f9655e-566c-4ec9-847b-ea96b2a6b6c1/kube-auth-proxy/0.log"
Apr 22 19:20:18.864164 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:18.864085 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cc989c66-cc7nk_d296e50b-a805-4e1b-9297-f74fb4549ed5/router/0.log"
Apr 22 19:20:19.318988 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.318955 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc_9b8ba212-e859-44de-b152-580da1af3aef/storage-initializer/0.log"
Apr 22 19:20:19.326787 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.326746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-qnpzc_9b8ba212-e859-44de-b152-580da1af3aef/main/0.log"
Apr 22 19:20:19.440292 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.440264 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-xmvlg_7067b83c-5305-41a3-8524-c1b5b42a8026/storage-initializer/0.log"
Apr 22 19:20:19.447751 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.447727 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-xmvlg_7067b83c-5305-41a3-8524-c1b5b42a8026/main/0.log"
Apr 22 19:20:19.560605 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.560578 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b_e62a65d3-9e6d-4a16-acc6-f1551bbcc181/storage-initializer/0.log"
Apr 22 19:20:19.569186 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.569125 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9s46b_e62a65d3-9e6d-4a16-acc6-f1551bbcc181/main/0.log"
Apr 22 19:20:19.678829 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.678800 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr_8dd278cf-7fd4-4ce4-b68b-93fb82a0036e/storage-initializer/0.log"
Apr 22 19:20:19.685757 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.685733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-bd9vr_8dd278cf-7fd4-4ce4-b68b-93fb82a0036e/main/0.log"
Apr 22 19:20:19.801104 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.801077 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-g78qp_782ef3df-ab21-461b-b8e3-cf4f2a87e46f/storage-initializer/0.log"
Apr 22 19:20:19.808337 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:19.808312 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-g78qp_782ef3df-ab21-461b-b8e3-cf4f2a87e46f/main/0.log"
Apr 22 19:20:26.324409 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:26.324378 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-58wc6_f50f3098-de64-4067-a39e-e51151f6082a/global-pull-secret-syncer/0.log"
Apr 22 19:20:26.627572 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:26.627484 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-985jk_ec13a05a-498a-4a5e-a065-7e57635aafff/konnectivity-agent/0.log"
Apr 22 19:20:26.686633 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:26.686603 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-32.ec2.internal_150594baeab35cb17ccfa66548a34222/haproxy/0.log"
Apr 22 19:20:33.309137 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.309062 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-p7ksl_309b4d32-03cc-43ef-b0f0-8f772378a81a/cluster-monitoring-operator/0.log"
Apr 22 19:20:33.348528 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.348494 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f87fm_33f2ccbe-3de8-4405-9e87-201aa5a5b773/kube-state-metrics/0.log"
Apr 22 19:20:33.387751 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.387718 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f87fm_33f2ccbe-3de8-4405-9e87-201aa5a5b773/kube-rbac-proxy-main/0.log"
Apr 22 19:20:33.415580 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.415554 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f87fm_33f2ccbe-3de8-4405-9e87-201aa5a5b773/kube-rbac-proxy-self/0.log"
Apr 22 19:20:33.696046 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.695968 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rk5lg_64727fd4-5eae-4fbd-ad64-ec2f5828bfbd/node-exporter/0.log"
Apr 22 19:20:33.726646 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.726592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rk5lg_64727fd4-5eae-4fbd-ad64-ec2f5828bfbd/kube-rbac-proxy/0.log"
Apr 22 19:20:33.748370 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.748347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rk5lg_64727fd4-5eae-4fbd-ad64-ec2f5828bfbd/init-textfile/0.log"
Apr 22 19:20:33.854784 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.854733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/prometheus/0.log"
Apr 22 19:20:33.878578 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.878556 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/config-reloader/0.log"
Apr 22 19:20:33.907215 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.907193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/thanos-sidecar/0.log"
Apr 22 19:20:33.928872 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.928846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/kube-rbac-proxy-web/0.log"
Apr 22 19:20:33.953416 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.953395 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/kube-rbac-proxy/0.log"
Apr 22 19:20:33.975032 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.975009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/kube-rbac-proxy-thanos/0.log"
Apr 22 19:20:33.995652 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:33.995634 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cf1faafb-76d6-42b7-bb43-29c6de68a436/init-config-reloader/0.log"
Apr 22 19:20:34.024073 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.024054 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kl76v_ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b/prometheus-operator/0.log"
Apr 22 19:20:34.049694 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.049673 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kl76v_ff4bc19f-0abc-4fdf-86c4-bfe2fc933d7b/kube-rbac-proxy/0.log"
Apr 22 19:20:34.074395 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.074369 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6zp84_fa4aef80-7b9b-4870-b6c3-3c67622e5979/prometheus-operator-admission-webhook/0.log"
Apr 22 19:20:34.592665 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.592629 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"]
Apr 22 19:20:34.593036 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.592974 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593036 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.592984 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593036 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.592993 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593036 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.592999 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593036 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.593015 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593036 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.593020 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593216 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.593081 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.593216 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.593092 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c7abbdc-132a-4446-bef1-63f1a368b70f" containerName="cleanup"
Apr 22 19:20:34.596323 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.596301 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.598882 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.598865 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tz8b7\"/\"openshift-service-ca.crt\""
Apr 22 19:20:34.598986 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.598910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tz8b7\"/\"default-dockercfg-7p5tx\""
Apr 22 19:20:34.599046 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.599011 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tz8b7\"/\"kube-root-ca.crt\""
Apr 22 19:20:34.604396 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.604377 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"]
Apr 22 19:20:34.700097 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.700062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-podres\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.700269 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.700103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-proc\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.700269 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.700163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5bfg\" (UniqueName: \"kubernetes.io/projected/1c7f3188-a560-4028-a375-6156fee52ef0-kube-api-access-x5bfg\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.700269 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.700211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-lib-modules\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.700398 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.700299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-sys\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.801656 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-sys\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.801843 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-podres\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.801843 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-proc\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.801843 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5bfg\" (UniqueName: \"kubernetes.io/projected/1c7f3188-a560-4028-a375-6156fee52ef0-kube-api-access-x5bfg\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.801843 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-lib-modules\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.801843 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-sys\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.802023 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-podres\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.802023 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-proc\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.802023 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.801879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1c7f3188-a560-4028-a375-6156fee52ef0-lib-modules\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.809817 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.809798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5bfg\" (UniqueName: \"kubernetes.io/projected/1c7f3188-a560-4028-a375-6156fee52ef0-kube-api-access-x5bfg\") pod \"perf-node-gather-daemonset-crlms\" (UID: \"1c7f3188-a560-4028-a375-6156fee52ef0\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:34.906112 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:34.906031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:35.031866 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.031842 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"]
Apr 22 19:20:35.034435 ip-10-0-130-32 kubenswrapper[2575]: W0422 19:20:35.034406 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1c7f3188_a560_4028_a375_6156fee52ef0.slice/crio-8d16a38beb843f79d0b44b733c01352c3d239d8a8085cf5ecc67221fb97f2f46 WatchSource:0}: Error finding container 8d16a38beb843f79d0b44b733c01352c3d239d8a8085cf5ecc67221fb97f2f46: Status 404 returned error can't find the container with id 8d16a38beb843f79d0b44b733c01352c3d239d8a8085cf5ecc67221fb97f2f46
Apr 22 19:20:35.035964 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.035940 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:20:35.691273 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.691236 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms" event={"ID":"1c7f3188-a560-4028-a375-6156fee52ef0","Type":"ContainerStarted","Data":"1ab66f6d390a0846c8e8ddc6bb5dc6bc8e3d251222777be241a3bf0af7c51a54"}
Apr 22 19:20:35.691273 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.691270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms" event={"ID":"1c7f3188-a560-4028-a375-6156fee52ef0","Type":"ContainerStarted","Data":"8d16a38beb843f79d0b44b733c01352c3d239d8a8085cf5ecc67221fb97f2f46"}
Apr 22 19:20:35.691768 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.691293 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:35.707825 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.707758 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms" podStartSLOduration=1.707745334 podStartE2EDuration="1.707745334s" podCreationTimestamp="2026-04-22 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:20:35.706628284 +0000 UTC m=+2271.946260669" watchObservedRunningTime="2026-04-22 19:20:35.707745334 +0000 UTC m=+2271.947377692"
Apr 22 19:20:35.840564 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.840532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/2.log"
Apr 22 19:20:35.847651 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:35.847610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2r8wk_03550605-e0bb-4434-8e90-08b3aecc5a4c/console-operator/3.log"
Apr 22 19:20:36.799809 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:36.799764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-shk2x_119686fb-62fb-4304-be1e-ea9e264fd21d/volume-data-source-validator/0.log"
Apr 22 19:20:37.581736 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:37.581707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-grs9r_6922ad30-ba0a-4bf8-b384-cdf6a0514c3a/dns/0.log"
Apr 22 19:20:37.603424 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:37.603391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-grs9r_6922ad30-ba0a-4bf8-b384-cdf6a0514c3a/kube-rbac-proxy/0.log"
Apr 22 19:20:37.735735 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:37.735708 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5jr6w_3bf65c2b-0944-4d58-bd8b-923617359ff3/dns-node-resolver/0.log"
Apr 22 19:20:38.327835 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:38.327790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h7ks7_cc6477a3-da8c-40f7-ae67-bf32ede541af/node-ca/0.log"
Apr 22 19:20:39.293681 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:39.293649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7cff94f675-fwl9g_51f9655e-566c-4ec9-847b-ea96b2a6b6c1/kube-auth-proxy/0.log"
Apr 22 19:20:39.444288 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:39.444232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cc989c66-cc7nk_d296e50b-a805-4e1b-9297-f74fb4549ed5/router/0.log"
Apr 22 19:20:40.022496 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:40.022468 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sn54r_dcba4051-c58c-4ba8-baba-853741840882/serve-healthcheck-canary/0.log"
Apr 22 19:20:40.665012 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:40.664962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zqnzm_37f17e4f-8487-4af2-b0cb-77595be064c5/kube-rbac-proxy/0.log"
Apr 22 19:20:40.687455 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:40.687424 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zqnzm_37f17e4f-8487-4af2-b0cb-77595be064c5/exporter/0.log"
Apr 22 19:20:40.710020 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:40.709986 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zqnzm_37f17e4f-8487-4af2-b0cb-77595be064c5/extractor/0.log"
Apr 22 19:20:41.704359 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:41.704327 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-crlms"
Apr 22 19:20:42.807353 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:42.807320 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-dd89cc56c-ddt59_32b7588b-a569-4c47-95a7-5ab772e8e085/manager/0.log"
Apr 22 19:20:44.184009 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:44.183979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54dc496758-2fhhm_95296fed-c059-456c-ac9e-67cb641f1a2f/manager/0.log"
Apr 22 19:20:48.501132 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:48.501099 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cnpcv_e2533f65-e508-47ea-9cb7-8bb858479a89/migrator/0.log"
Apr 22 19:20:48.520021 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:48.519999 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-cnpcv_e2533f65-e508-47ea-9cb7-8bb858479a89/graceful-termination/0.log"
Apr 22 19:20:48.878631 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:48.878548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-x4rmq_3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb/kube-storage-version-migrator-operator/1.log"
Apr 22 19:20:48.880402 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:48.880375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-x4rmq_3efbd4c5-3c68-4fb4-8a66-b5731e17e5fb/kube-storage-version-migrator-operator/0.log"
Apr 22 19:20:49.910835 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:49.910808 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:20:49.932318 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:49.932282 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/egress-router-binary-copy/0.log"
Apr 22 19:20:49.955735 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:49.955707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/cni-plugins/0.log"
Apr 22 19:20:49.980682 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:49.980664 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/bond-cni-plugin/0.log"
Apr 22 19:20:50.008580 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:50.008556 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/routeoverride-cni/0.log"
Apr 22 19:20:50.044790 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:50.044747 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/whereabouts-cni-bincopy/0.log"
Apr 22 19:20:50.099363 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:50.099339 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6x6wq_5e2e434e-269f-4708-b72e-607842cf2bd9/whereabouts-cni/0.log"
Apr 22 19:20:50.550364 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:50.550333 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sspfl_366a36aa-b21c-49e8-8ed6-a85ab6ac5d4f/kube-multus/0.log"
Apr 22 19:20:50.632405 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:50.632380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7zmbr_19ace946-23b0-451c-93fa-078938130dd5/network-metrics-daemon/0.log"
Apr 22 19:20:50.650620 ip-10-0-130-32 kubenswrapper[2575]: I0422 19:20:50.650597 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7zmbr_19ace946-23b0-451c-93fa-078938130dd5/kube-rbac-proxy/0.log"